Class RMSprop

RMSProp optimizer. It is recommended to leave the parameters of this optimizer at their default values (except the learning rate, which can be freely tuned). This optimizer is usually a good choice for recurrent neural networks.

Inheritance
System.Object
Keras
Base
RMSprop
Implements
System.IDisposable
Inherited Members
Base.Parameters
Base.None
Base.Init()
Base.ToPython()
Base.InvokeStaticMethod(Object, String, Dictionary<String, Object>)
Base.InvokeMethod(String, Dictionary<String, Object>)
Base.Item[String]
Keras.Instance
Keras.keras
Keras.keras2onnx
Keras.tfjs
Keras.Dispose()
Keras.ToTuple(Array)
Keras.ToList(Array)
System.Object.Equals(System.Object)
System.Object.Equals(System.Object, System.Object)
System.Object.GetHashCode()
System.Object.GetType()
System.Object.MemberwiseClone()
System.Object.ReferenceEquals(System.Object, System.Object)
System.Object.ToString()
Namespace: Keras.Optimizers
Assembly: Keras.dll
Syntax
public class RMSprop : Base, IDisposable

Constructors

RMSprop(Single, Single, Nullable<Single>, Single)

Initializes a new instance of the RMSprop class.

Declaration
public RMSprop(float lr = 0.01F, float rho = 0.9F, float? epsilon = default(float?), float decay = 0F)
Parameters

System.Single lr
float >= 0. Learning rate.

System.Single rho
float > 0. The rho factor: discounting factor for the moving average of squared gradients.

System.Nullable<System.Single> epsilon
float >= 0. Fuzz factor. If null, defaults to K.epsilon().

System.Single decay
float >= 0. Learning rate decay applied over each update.

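Examples

The following is a minimal, hypothetical sketch of passing this optimizer to a model when compiling with Keras.NET. The layer sizes, input shape, loss, and metrics are illustrative assumptions and not part of this reference; only the learning rate is set explicitly, with rho, epsilon, and decay left at their defaults as the class summary recommends.

using Keras;
using Keras.Layers;
using Keras.Models;
using Keras.Optimizers;

// Build a small model (assumed architecture, for illustration only).
var model = new Sequential();
model.Add(new Dense(32, activation: "relu", input_shape: new Shape(16)));
model.Add(new Dense(1, activation: "sigmoid"));

// Construct RMSprop with a custom learning rate; rho, epsilon, and decay
// keep their defaults (0.9, K.epsilon(), 0) per the constructor above.
var optimizer = new RMSprop(lr: 0.001f);

// Compile the model with the optimizer instance (loss and metrics are assumed here).
model.Compile(optimizer: optimizer, loss: "binary_crossentropy", metrics: new string[] { "accuracy" });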
Implements

System.IDisposable