
Class RMSProp

RMSProp optimizer.

RMSProp keeps a per-parameter moving average of recent squared gradients and divides each gradient by the root of that average, so the effective step size adapts to the magnitude of recent gradients. It is recommended to leave the parameters of this optimizer at their default values (except the learning rate, which can be freely tuned).
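For reference, the standard RMSProp update (this class presumably follows the same form) with learning rate \(\eta\), decay factor \(\rho\), and fuzz factor \(\epsilon\):

\[
v_t = \rho\, v_{t-1} + (1 - \rho)\, g_t^2, \qquad
\theta_{t+1} = \theta_t - \frac{\eta}{\sqrt{v_t} + \epsilon}\, g_t
\]

Implementations vary on whether \(\epsilon\) is added inside or outside the square root; in either case it only guards against division by zero.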

Inheritance
System.Object
BaseOptimizer
RMSProp
Inherited Members
BaseOptimizer.Name
BaseOptimizer.LearningRate
BaseOptimizer.Momentum
BaseOptimizer.DecayRate
Namespace: SiaNet.Optimizers
Assembly: SiaNet.dll
Syntax
public class RMSProp : BaseOptimizer

Constructors


RMSProp(Single, Single, Single, Single)

Initializes a new instance of the RMSProp class.

Declaration
public RMSProp(float lr = 0.001F, float rho = 0.9F, float decayRate = 0F, float epsilon = 1E-07F)
Parameters
Type            Name        Description
System.Single   lr          The initial learning rate for the optimizer.
System.Single   rho         RMSProp decay factor: the fraction of the squared-gradient average retained at each time step.
System.Single   decayRate   Learning rate decay applied over each update.
System.Single   epsilon     Fuzz factor; a small constant greater than zero that guards against division by zero.
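A brief construction sketch using only the constructor documented above; the surrounding model and training setup are omitted:

// Default configuration, as recommended.
var optimizer = new RMSProp();

// Explicitly tuned learning rate with a small per-update decay.
var tuned = new RMSProp(lr: 0.01F, rho: 0.9F, decayRate: 1E-05F, epsilon: 1E-07F);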

Properties


Epsilon

Fuzz factor; a small constant greater than zero, used for numerical stability.

Declaration
public float Epsilon { get; set; }
Property Value
Type            Description
System.Single   The epsilon value.


Rho

RMSProp decay factor: the fraction of the squared-gradient average retained at each time step.

Declaration
public float Rho { get; set; }
Property Value
Type            Description
System.Single   The rho value.
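Both properties are settable, so an existing optimizer can be re-tuned in place. A minimal sketch (the values shown are illustrative, not recommendations):

var optimizer = new RMSProp();
optimizer.Rho = 0.95F;      // retain more of the squared-gradient history
optimizer.Epsilon = 1E-08F; // smaller fuzz factor; must stay greater than zero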

See Also

BaseOptimizer