DistributedGradients is a base class for distributed gradient ascent / descent.
Creates a new model object. If any of the arguments are nil, the default value for that argument will be used.
DistributedGradients.new(gradientChangeMode: string): DistributedGradientObject
gradientChangeMode: Sets how a given gradient changes the model parameters (see the sketch after this list). Available options are:
Descent (Default): the gradient is subtracted from the model parameters.
Ascent: the gradient is added to the model parameters.
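A minimal construction sketch; the require path below is a placeholder for wherever the module lives in your project:

```lua
-- Sketch only: the require path is an assumption; point it at the actual
-- location of the DistributedGradients module in your project.
local DistributedGradients = require(game.ServerScriptService.DistributedGradients)

-- "Descent" means incoming gradients are subtracted from the stored
-- model parameters.
local DistributedGradientsObject = DistributedGradients.new("Descent")
```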
Sets the model's parameters. If any of the arguments are nil, the previous value for that argument will be kept.
DistributedGradients:setParameters(gradientChangeMode: string)
gradientChangeMode: Sets how a given gradient changes the model parameters. Available options are:
Descent: the gradient is subtracted from the model parameters.
Ascent: the gradient is added to the model parameters.
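Continuing the sketch above, a mode can be switched at runtime:

```lua
-- Switch the object from descent to ascent; passing nil would keep the
-- previous setting.
DistributedGradientsObject:setParameters("Ascent")
```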
Adds gradients to the DistributedGradients object so that they can be applied to the main model's parameters.
DistributedGradients:addGradients(Gradients: any)
Gradients: The gradients to be added.
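A hedged sketch of feeding in gradients; `computeGradientsOnWorker` is a hypothetical helper standing in for however your workers produce them:

```lua
-- Hypothetical helper: produces gradients elsewhere (for example, on
-- another server or in a parallel actor).
local Gradients = computeGradientsOnWorker()

-- Queue the gradients for application to the main model's parameters.
DistributedGradientsObject:addGradients(Gradients)
```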
Sets the main model's parameters for this DistributedGradients object.
DistributedGradients:setMainModelParameters(ModelParameters: any, doNotDeepCopy: boolean)
ModelParameters: The model parameters for the main model.
doNotDeepCopy: Set to true to store the model parameters by reference instead of deep-copying them.
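A sketch of attaching a main model's parameters; `MainModel` and its `getModelParameters` call are assumptions standing in for however you obtain the parameters:

```lua
-- Assumed: `MainModel` is a model whose parameters we want to train.
local MainModelParameters = MainModel:getModelParameters()

-- Pass false so the object keeps its own deep copy of the parameters.
DistributedGradientsObject:setMainModelParameters(MainModelParameters, false)
```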
Gets the model parameters from the DistributedGradients object.
DistributedGradients:getModelParameters(doNotDeepCopy: boolean): any
doNotDeepCopy: Set to true to return the model parameters by reference instead of deep-copying them.
Returns:
ModelParameters: The model parameters for the main model.
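A sketch of syncing updated parameters back into the main model; `setModelParameters` on the model is an assumption:

```lua
-- Pull a deep copy of the (now updated) parameters out of the object.
local UpdatedModelParameters = DistributedGradientsObject:getModelParameters(false)

-- Assumed model method: push the updated parameters back into the model.
MainModel:setModelParameters(UpdatedModelParameters)
```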
Creates a new thread for real-time gradient descent / ascent.
DistributedGradients:start(): coroutine
Stops the thread for real-time training.
DistributedGradients:stop()
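A sketch of the real-time loop: once started, gradients added from elsewhere are applied on the background thread until stop() is called; the wait duration here is arbitrary:

```lua
-- Begin applying queued gradients in real time on a background thread.
DistributedGradientsObject:start()

task.wait(60) -- arbitrary training window

-- Halt the background thread once real-time training should end.
DistributedGradientsObject:stop()
```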
Clears the stored gradients inside the DistributedGradients object.
DistributedGradients:clearGradients()
Destroys the model object.
DistributedGradients:destroy()
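A teardown sketch combining the last two calls once training is finished:

```lua
-- Discard any gradients that have not yet been applied.
DistributedGradientsObject:clearGradients()

-- Release the object itself.
DistributedGradientsObject:destroy()
```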