Beta Version 2.11.0
Added
- Added these value schedulers under the “ValueSchedulers” section (a conceptual sketch of one of these schedulers follows the “Added” list):
  - Chained
  - Constant
  - CosineAnnealing
  - Exponential
  - InverseSquareRoot
  - InverseTime
  - Linear
  - MultipleStep
  - Multiplicative
  - Polynomial
  - Sequential
  - Step
- Added these optimizers under the “Optimizers” section:
  - ResilientBackwardPropagation
  - AdaptiveMomentEstimationWeightDecay
- Added OneVsOne under the “Others” section.
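
To illustrate what the new value schedulers compute, here is a minimal cosine annealing schedule in Python. This is a conceptual sketch of the technique only, not this library's ValueScheduler API; the class name, method name, and parameters (maximum_time_value, minimum_value) are assumptions made for the example.

```python
import math


class CosineAnnealingScheduler:
    """Conceptual sketch of a cosine annealing value scheduler.

    Decays a value (e.g. a learning rate) from its initial level down to
    minimum_value over maximum_time_value steps, following a cosine curve.
    """

    def __init__(self, maximum_time_value, minimum_value=0.0):
        self.maximum_time_value = maximum_time_value
        self.minimum_value = minimum_value

    def calculate(self, initial_value, time_value):
        # Clamp the step count so the schedule stays at the minimum afterwards.
        time_value = min(time_value, self.maximum_time_value)
        cosine_factor = (1 + math.cos(math.pi * time_value / self.maximum_time_value)) / 2
        return self.minimum_value + (initial_value - self.minimum_value) * cosine_factor


# Example: anneal a learning rate of 0.1 down to 0 over 100 steps.
scheduler = CosineAnnealingScheduler(maximum_time_value=100)
for step in (0, 50, 100):
    print(step, scheduler.calculate(0.1, step))
```

The other listed schedulers follow the same pattern with a different decay curve (step-wise, exponential, polynomial, and so on), and Chained/Sequential compose several of them.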
Changes
- ValueSchedulers can now be used in place of Optimizers for scheduling the learning rate (a hypothetical calling sketch follows this list).
- The calculate() function now accepts ModelParameters as the third parameter for all the optimizers under the “Optimizers” section.
- Renamed AdaptiveGradientDelta to AdaptiveDelta under the “Optimizers” section.
- Optimized the NeuralNetwork code so that the activation function calculation is not performed for bias values (see the sketch after this list).
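
The two API-facing items above can be pictured with the following Python sketch. It is a hedged illustration only: the idea that a scheduler and an optimizer expose the same calculate() entry point, and the class names, parameter names, and weight-decay example, are assumptions; only the fact that calculate() takes the model parameters as its third argument comes from this changelog.

```python
class StepValueScheduler:
    """Hypothetical scheduler exposing the same calculate() entry point an
    optimizer does, so it can be slotted in where only learning-rate
    scheduling is needed."""

    def __init__(self, time_value=10, decay_rate=0.5):
        self.time_value = time_value
        self.decay_rate = decay_rate
        self.current_step = 0

    def calculate(self, learning_rate, gradient, model_parameters=None):
        # Scheduling only adjusts the learning rate; the gradient passes through scaled.
        self.current_step += 1
        scheduled_rate = learning_rate * (self.decay_rate ** (self.current_step // self.time_value))
        return [[scheduled_rate * g for g in row] for row in gradient]


class GradientDescentWithWeightDecay:
    """Hypothetical optimizer that reads the model parameters (third argument),
    e.g. to apply weight decay, before returning the scaled update."""

    def calculate(self, learning_rate, gradient, model_parameters):
        weight_decay = 0.01
        return [
            [learning_rate * (g + weight_decay * p) for g, p in zip(g_row, p_row)]
            for g_row, p_row in zip(gradient, model_parameters)
        ]


# Either object can be dropped into the same training-step call site.
gradient = [[0.2, -0.1], [0.05, 0.3]]
parameters = [[1.0, -0.5], [0.25, 0.75]]
for updater in (StepValueScheduler(), GradientDescentWithWeightDecay()):
    update = updater.calculate(0.1, gradient, parameters)
    print(type(updater).__name__, update)
```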
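
The NeuralNetwork optimization above is a pure computational saving: a constant bias unit never needs to pass through the activation function. A minimal NumPy sketch of the idea is shown below, assuming the bias occupies the first column of the layer matrix; it is not the library's NeuralNetwork code, and sigmoid, forward_layer, and the column layout are illustrative choices.

```python
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


def forward_layer(inputs_with_bias, weights):
    """Forward pass for one layer where column 0 of the output is the bias unit.

    The activation function is applied only to the non-bias columns, matching
    the optimization described above; the bias column stays fixed at 1.
    """
    z = inputs_with_bias @ weights          # pre-activation values
    activated = np.empty_like(z)
    activated[:, 0] = 1.0                   # bias unit is constant, no activation needed
    activated[:, 1:] = sigmoid(z[:, 1:])    # activate only the real neurons
    return activated


# Example: a batch of 2 samples, 3 inputs (including bias), 4 outputs (including bias).
x = np.array([[1.0, 0.5, -0.2],
              [1.0, 0.1,  0.7]])
w = np.random.default_rng(0).normal(size=(3, 4))
print(forward_layer(x, w))
```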
Removed
- Removed TimeDecay and StepDecay from the “ValueSchedulers” section.
- Removed LearningRateStepDecay and LearningRateTimeDecay from the “Optimizers” section.
Fixes
- Fixed some bugs in AdaptiveMomentEstimation and NesterovAcceleratedAdaptiveMomentEstimation under the “Optimizers” section.