
Class Adadelta

Adadelta is a more robust extension of Adagrad that adapts learning rates based on a moving window of gradient updates, instead of accumulating all past gradients. This way, Adadelta continues learning even after many updates have been made. Unlike Adagrad, the original version of Adadelta does not require an initial learning rate to be set. In this version, the initial learning rate and decay factor can be set, as in most other optimizers.
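For background, the standard Adadelta update rule (Zeiler, 2012) maintains decaying averages of the squared gradients and the squared parameter updates; the ρ and ε below correspond to the Rho and Epsilon properties of this class. This is a sketch of the published algorithm, and the exact form used by SiaNet (for example, how the initial learning rate lr scales the step) may differ:

```latex
\begin{align*}
E[g^2]_t &= \rho\, E[g^2]_{t-1} + (1 - \rho)\, g_t^2 \\
\Delta\theta_t &= -\frac{\sqrt{E[\Delta\theta^2]_{t-1} + \epsilon}}{\sqrt{E[g^2]_t + \epsilon}}\; g_t \\
E[\Delta\theta^2]_t &= \rho\, E[\Delta\theta^2]_{t-1} + (1 - \rho)\, (\Delta\theta_t)^2 \\
\theta_{t+1} &= \theta_t + \Delta\theta_t
\end{align*}
```

Because the step size is a ratio of these running averages rather than an accumulated sum, the method keeps adapting after many updates, which is why the default learning rate of 1 is usually left unchanged.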

Inheritance
System.Object
BaseOptimizer
Adadelta
Inherited Members
BaseOptimizer.Name
BaseOptimizer.LearningRate
BaseOptimizer.Momentum
BaseOptimizer.DecayRate
Namespace: SiaNet.Optimizers
Assembly: SiaNet.dll
Syntax
public class Adadelta : BaseOptimizer

Constructors


Adadelta(Single, Single, Single, Single)

Initializes a new instance of the Adadelta class.

Declaration
public Adadelta(float lr = 1F, float rho = 0.95F, float decayRate = 0F, float epsilon = 1E-07F)
Parameters

| Type | Name | Description |
| --- | --- | --- |
| System.Single | lr | Initial learning rate; defaults to 1. It is recommended to leave it at the default value. |
| System.Single | rho | Adadelta decay factor, corresponding to the fraction of the gradient to keep at each time step. |
| System.Single | decayRate | Learning rate decay factor applied at each update. |
| System.Single | epsilon | Fuzz factor; a small positive constant used to avoid division by zero. |
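A short usage sketch, using only the constructor and properties documented on this page (how the optimizer is then attached to a model depends on the rest of the SiaNet API and is not shown here):

```csharp
using SiaNet.Optimizers;

// Default configuration: lr = 1, rho = 0.95, decayRate = 0, epsilon = 1e-7.
var optimizer = new Adadelta();

// Explicit configuration; leaving lr at its default of 1 is recommended.
var tuned = new Adadelta(lr: 1f, rho: 0.9f, decayRate: 1e-4f, epsilon: 1e-6f);

// Hyperparameters can also be adjusted after construction.
tuned.Rho = 0.95f;
tuned.Epsilon = 1e-7f;
```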

Properties


Epsilon

Fuzz factor: a very small float value greater than 0, used to avoid division by zero.

Declaration
public float Epsilon { get; set; }
Property Value

| Type | Description |
| --- | --- |
| System.Single | The epsilon value. |


Rho

Adadelta decay factor, corresponding to the fraction of the gradient to keep at each time step.

Declaration
public float Rho { get; set; }
Property Value

| Type | Description |
| --- | --- |
| System.Single | The rho value. |

See Also

BaseOptimizer