Class Adagrad
Adagrad is an optimizer with parameter-specific learning rates that are adapted according to how frequently each parameter is updated during training: the more updates a parameter receives, the smaller its learning rate becomes.
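For reference, the textbook Adagrad update that this behavior follows (a general sketch, not taken from SiaNet's source; the exact implementation, e.g. whether epsilon sits inside or outside the square root, may differ) is:

```latex
% Accumulate squared gradients per parameter,
% then scale each parameter's step by the accumulated history:
G_t = G_{t-1} + g_t^2
\theta_{t+1} = \theta_t - \frac{\eta}{\sqrt{G_t} + \epsilon} \, g_t
```

Here \(\eta\) is the initial learning rate (the `lr` constructor parameter) and \(\epsilon\) is the fuzz factor (the `epsilon` parameter) that prevents division by zero.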
Namespace: SiaNet.Optimizers
Assembly: SiaNet.dll
Syntax
public class Adagrad : BaseOptimizer
Constructors
Adagrad(Single, Single, Single)
Initializes a new instance of the Adagrad class.
Declaration
public Adagrad(float lr = 0.01F, float decayRate = 0F, float epsilon = 1E-07F)
Parameters
| Type | Name | Description |
|---|---|---|
| System.Single | lr | Initial learning rate for the optimizer. |
| System.Single | decayRate | Learning rate decay applied at each update. |
| System.Single | epsilon | Fuzz factor; a small positive constant that prevents division by zero. |
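A minimal construction sketch using the defaults and an explicitly tuned variant (the hyperparameter values here are illustrative; in practice the instance would be passed to a model's compile or training setup):

```csharp
using SiaNet.Optimizers;

// Default hyperparameters: lr = 0.01, no decay, epsilon = 1e-7.
var adagrad = new Adagrad();

// Explicit values, e.g. a larger initial learning rate
// with a small per-update decay.
var tuned = new Adagrad(lr: 0.05f, decayRate: 1e-4f, epsilon: 1e-7f);
```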
Properties
Epsilon
Fuzz factor; a small positive constant that prevents division by zero.
Declaration
public float Epsilon { get; set; }
Property Value
| Type | Description |
|---|---|
| System.Single | The epsilon value. |
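Since Epsilon exposes a setter, the fuzz factor can also be adjusted after construction, a sketch:

```csharp
var optimizer = new Adagrad();
// Epsilon is a read-write property, so the default 1e-7 fuzz factor
// can be tightened if a smaller stabilizing constant is preferred.
optimizer.Epsilon = 1e-8f;
```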