
Class Adagrad

Adagrad is an optimizer with parameter-specific learning rates, adapted according to how frequently each parameter is updated during training: the more updates a parameter receives, the smaller its learning rate becomes.
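
The mechanism can be sketched as follows. This is an illustrative C# fragment, not SiaNet's internal implementation, and all variable names in it are hypothetical: each parameter accumulates the sum of its squared gradients, and its effective step size is the base learning rate divided by the square root of that accumulator (plus Epsilon).

using System;

// Illustrative Adagrad update for a small parameter vector.
float[] weights = { 0.5f, -0.3f };
float[] gradients = { 0.1f, -0.2f };     // gradients from the current step
float[] accumSquaredGrad = new float[2]; // running sum of squared gradients

float lr = 0.01f;      // base learning rate
float epsilon = 1e-7f; // fuzz factor, keeps the denominator nonzero

for (int i = 0; i < weights.Length; i++)
{
    accumSquaredGrad[i] += gradients[i] * gradients[i];
    // A frequently updated parameter grows a larger accumulator,
    // so its effective learning rate shrinks over time.
    weights[i] -= lr * gradients[i] / ((float)Math.Sqrt(accumSquaredGrad[i]) + epsilon);
}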

Inheritance
System.Object
BaseOptimizer
Adagrad
Inherited Members
BaseOptimizer.Name
BaseOptimizer.LearningRate
BaseOptimizer.Momentum
BaseOptimizer.DecayRate
Namespace: SiaNet.Optimizers
Assembly: SiaNet.dll
Syntax
public class Adagrad : BaseOptimizer

Constructors


Adagrad(Single, Single, Single)

Initializes a new instance of the Adagrad class.

Declaration
public Adagrad(float lr = 0.01F, float decayRate = 0F, float epsilon = 1E-07F)
Parameters

Type            Name        Description
System.Single   lr          Initial learning rate for the optimizer.
System.Single   decayRate   Learning rate decay over each update.
System.Single   epsilon     Fuzz factor; a small value greater than zero, added to the denominator for numerical stability.
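
For example, the optimizer can be constructed with these documented defaults (the argument values below mirror the declaration above; wiring the optimizer into a model is part of the wider SiaNet API and is not shown here):

using SiaNet.Optimizers;

// Explicitly pass the documented default values...
var optimizer = new Adagrad(lr: 0.01f, decayRate: 0f, epsilon: 1E-07f);

// ...or rely on the defaults entirely.
var defaultOptimizer = new Adagrad();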

Properties


Epsilon

Fuzz factor; a small value greater than zero, added to the denominator for numerical stability.

Declaration
public float Epsilon { get; set; }
Property Value

Type            Description
System.Single   The epsilon.
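
Because the property has a setter, the fuzz factor can also be adjusted after construction; a hypothetical usage sketch:

var optimizer = new Adagrad();
optimizer.Epsilon = 1e-6f; // enlarge the fuzz factor for extra numerical headroom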

See Also

BaseOptimizer