
Class Adagrad

Adagrad is an optimizer with parameter-specific learning rates, which are adapted relative to how frequently a parameter gets updated during training. The more updates a parameter receives, the smaller its effective learning rate becomes.
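
The underlying update (the standard Adagrad formulation as used by Keras itself; shown here as a reference sketch, not something specific to this binding) keeps a per-parameter running sum of squared gradients and divides each step by its square root, so frequently updated parameters take smaller steps:

a_t = a_{t-1} + g_t^2
\theta_{t+1} = \theta_t - \frac{lr}{\sqrt{a_t} + \epsilon} \, g_t

Here g_t is the gradient of one parameter at step t, and lr and epsilon correspond to the constructor arguments documented below.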

Inheritance
System.Object
Keras
Base
Adagrad
Implements
System.IDisposable
Inherited Members
Base.Parameters
Base.None
Base.Init()
Base.ToPython()
Base.InvokeStaticMethod(Object, String, Dictionary<String, Object>)
Base.InvokeMethod(String, Dictionary<String, Object>)
Base.Item[String]
Keras.Instance
Keras.keras
Keras.keras2onnx
Keras.tfjs
Keras.Dispose()
Keras.ToTuple(Array)
Keras.ToList(Array)
System.Object.Equals(System.Object)
System.Object.Equals(System.Object, System.Object)
System.Object.GetHashCode()
System.Object.GetType()
System.Object.MemberwiseClone()
System.Object.ReferenceEquals(System.Object, System.Object)
System.Object.ToString()
Namespace: Keras.Optimizers
Assembly: Keras.dll
Syntax
public class Adagrad : Base, IDisposable

Constructors


Adagrad(Single, Nullable<Single>, Single)

Initializes a new instance of the Adagrad class.

Declaration
public Adagrad(float lr = 0.01F, float? epsilon = default(float?), float decay = 0F)
Parameters
Type | Name | Description
System.Single | lr | float >= 0. Initial learning rate.
System.Nullable<System.Single> | epsilon | float >= 0. If null, defaults to K.epsilon().
System.Single | decay | float >= 0. Learning rate decay over each update.
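
Examples

A minimal usage sketch (this assumes the Keras.NET Sequential/Dense/Compile API and a configured Python Keras backend; the layer sizes and loss here are illustrative, not prescribed):

using Keras;
using Keras.Layers;
using Keras.Models;
using Keras.Optimizers;

// Build a small classifier (layer sizes are illustrative).
var model = new Sequential();
model.Add(new Dense(64, activation: "relu", input_shape: new Shape(100)));
model.Add(new Dense(10, activation: "softmax"));

// Compile with Adagrad. epsilon is left null so it falls back to K.epsilon();
// decay = 0 keeps the initial learning rate constant across updates.
model.Compile(optimizer: new Adagrad(lr: 0.01f, epsilon: null, decay: 0f),
              loss: "categorical_crossentropy",
              metrics: new string[] { "accuracy" });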

Implements

System.IDisposable