Class RMSProp
RMSProp optimizer.
RMSProp keeps a moving average of the squared gradients for each parameter and divides the gradient by the root of this average, giving each parameter its own adaptive step size. It is recommended to leave the parameters of this optimizer at their default values.
Namespace: SiaNet.Optimizers
Assembly: SiaNet.dll
Syntax
public class RMSProp : BaseOptimizer
Constructors
RMSProp(Single, Single, Single, Single)
Initializes a new instance of the RMSProp class.
Declaration
public RMSProp(float lr = 0.001F, float rho = 0.9F, float decayRate = 0F, float epsilon = 1E-07F)
Parameters
Type | Name | Description |
---|---|---|
System.Single | lr | The initial learning rate for the optimizer. |
System.Single | rho | RMSProp decay factor; the fraction of the squared-gradient moving average retained at each time step. |
System.Single | decayRate | Learning rate decay over each update. |
System.Single | epsilon | Fuzz factor; a small positive constant added to the denominator for numerical stability. |
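The roles of lr, rho, and epsilon can be sketched with a generic RMSProp update step (a NumPy-free illustration, not SiaNet's actual implementation; decayRate, which decays the learning rate over updates, is omitted for brevity):

```python
import math

def rmsprop_step(param, grad, cache, lr=0.001, rho=0.9, epsilon=1e-7):
    # Accumulate a leaky moving average of squared gradients: rho
    # controls how much of the history is kept at each time step.
    cache = rho * cache + (1.0 - rho) * grad ** 2
    # Scale the step by the root of that average; epsilon guards
    # against division by zero when the average is tiny.
    param = param - lr * grad / (math.sqrt(cache) + epsilon)
    return param, cache

# Toy example: minimize f(x) = x^2 starting from x = 5.0.
x, cache = 5.0, 0.0
for _ in range(2000):
    grad = 2.0 * x
    x, cache = rmsprop_step(x, grad, cache, lr=0.01)
```

Because the gradient is normalized by its own recent magnitude, the effective step settles near lr regardless of the gradient's scale, which is why the defaults transfer well across problems.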
Properties
Epsilon
Fuzz factor; a small positive constant added to the denominator for numerical stability.
Declaration
public float Epsilon { get; set; }
Property Value
Type | Description |
---|---|
System.Single | The epsilon. |
Rho
RMSProp decay factor; the fraction of the squared-gradient moving average retained at each time step.
Declaration
public float Rho { get; set; }
Property Value
Type | Description |
---|---|
System.Single | The rho. |