Class Adagrad
Adagrad is an optimizer with parameter-specific learning rates, which are adapted relative to how frequently a parameter gets updated during training. The more updates a parameter receives, the smaller the learning rate.
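The shrinking step size comes from dividing the gradient by the running sum of its squares. A minimal sketch of that per-parameter rule for a single weight (illustrative only, using the standard Adagrad formulation; this is not the Keras.dll internals):

```csharp
using System;

class AdagradSketch
{
    static void Main()
    {
        float lr = 0.01f;       // initial learning rate
        float epsilon = 1e-7f;  // numerical-stability term, standing in for K.epsilon()
        float accum = 0f;       // running sum of squared gradients for this parameter
        float w = 1.0f;         // the parameter being optimized

        // Repeated identical gradients: the accumulator grows each step,
        // so the effective step size shrinks.
        foreach (float grad in new[] { 0.5f, 0.5f, 0.5f })
        {
            accum += grad * grad;
            w -= lr * grad / ((float)Math.Sqrt(accum) + epsilon);
            Console.WriteLine($"w = {w}"); // steps: 0.0100, 0.0071, 0.0058
        }
    }
}
```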
Implements
System.IDisposable
Inherited Members
System.Object.Equals(System.Object)
System.Object.Equals(System.Object, System.Object)
System.Object.GetHashCode()
System.Object.GetType()
System.Object.MemberwiseClone()
System.Object.ReferenceEquals(System.Object, System.Object)
System.Object.ToString()
Namespace: Keras.Optimizers
Assembly: Keras.dll
Syntax
public class Adagrad : Base, IDisposable
Constructors
Adagrad(Single, Nullable&lt;Single&gt;, Single)
Initializes a new instance of the Adagrad class.
Declaration
public Adagrad(float lr = 0.01F, float? epsilon = default(float?), float decay = 0F)
Parameters
| Type | Name | Description |
|---|---|---|
| System.Single | lr | float >= 0. Initial learning rate. |
| System.Nullable&lt;System.Single&gt; | epsilon | float >= 0. If null, defaults to K.epsilon(). |
| System.Single | decay | float >= 0. Learning rate decay over each update. |
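Example

A usage sketch compiling a model with this optimizer. It assumes the Sequential, Dense, Shape, and Compile APIs from Keras.Models, Keras.Layers, and the Keras namespace as they appear in typical Keras.NET examples; the model itself is hypothetical, and the exact Compile overload may differ:

```csharp
using Keras;
using Keras.Layers;
using Keras.Models;
using Keras.Optimizers;

class AdagradExample
{
    static void Main()
    {
        // Hypothetical two-layer regression model.
        var model = new Sequential();
        model.Add(new Dense(32, activation: "relu", input_shape: new Shape(8)));
        model.Add(new Dense(1));

        // epsilon: null falls back to K.epsilon();
        // decay > 0 would decay the learning rate on each update.
        model.Compile(optimizer: new Adagrad(lr: 0.01f, epsilon: null, decay: 0f),
                      loss: "mse",
                      metrics: new string[] { "mae" });
    }
}
```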
Implements
System.IDisposable