
Class Adam

Adam optimizer. Default parameters follow those provided in the original paper.
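For context, the constructor parameters map onto the update rule from the original paper (Kingma & Ba, "Adam: A Method for Stochastic Optimization"). A sketch of that rule, with g_t the gradient of the loss at step t, is:

```latex
\begin{align*}
m_t &= \beta_1\, m_{t-1} + (1-\beta_1)\, g_t            && \text{first-moment estimate}\\
v_t &= \beta_2\, v_{t-1} + (1-\beta_2)\, g_t^2          && \text{second-moment estimate}\\
\hat{m}_t &= m_t / (1-\beta_1^t), \qquad \hat{v}_t = v_t / (1-\beta_2^t) && \text{bias correction}\\
\theta_t &= \theta_{t-1} - \mathrm{lr}\cdot \hat{m}_t / \bigl(\sqrt{\hat{v}_t} + \epsilon\bigr) && \text{parameter update}
\end{align*}
```

The constructor defaults (lr = 0.001, beta_1 = 0.9, beta_2 = 0.999) are the values recommended in that paper.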

Inheritance
System.Object
Keras
Base
Adam
Implements
System.IDisposable
Inherited Members
Base.Parameters
Base.None
Base.Init()
Base.ToPython()
Base.InvokeStaticMethod(Object, String, Dictionary<String, Object>)
Base.InvokeMethod(String, Dictionary<String, Object>)
Base.Item[String]
Keras.Instance
Keras.keras
Keras.keras2onnx
Keras.tfjs
Keras.Dispose()
Keras.ToTuple(Array)
Keras.ToList(Array)
System.Object.Equals(System.Object)
System.Object.Equals(System.Object, System.Object)
System.Object.GetHashCode()
System.Object.GetType()
System.Object.MemberwiseClone()
System.Object.ReferenceEquals(System.Object, System.Object)
System.Object.ToString()
Namespace: Keras.Optimizers
Assembly: Keras.dll
Syntax
public class Adam : Base, IDisposable

Constructors


Adam(Single, Single, Single, Nullable<Single>, Single, Boolean)

Initializes a new instance of the Adam class.

Declaration
public Adam(float lr = 0.001F, float beta_1 = 0.9F, float beta_2 = 0.999F, float? epsilon = default(float?), float decay = 0F, bool amsgrad = false)
Parameters
Type Name Description
System.Single lr

float >= 0. Learning rate.

System.Single beta_1

float, 0 < beta_1 < 1. Exponential decay rate for the first moment estimates; generally close to 1.

System.Single beta_2

float, 0 < beta_2 < 1. Exponential decay rate for the second moment estimates; generally close to 1.

System.Nullable<System.Single> epsilon

float >= 0. Fuzz factor added for numerical stability. If null, defaults to the backend epsilon.

System.Single decay

float >= 0. Learning rate decay over each update.

System.Boolean amsgrad

Boolean. Whether to apply the AMSGrad variant of this algorithm from the paper "On the Convergence of Adam and Beyond".
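A minimal usage sketch is shown below. It assumes the companion Sequential, Dense, and Shape types from the same library; the layer sizes, loss, and metrics are illustrative only.

```csharp
using Keras;
using Keras.Layers;
using Keras.Models;
using Keras.Optimizers;

// Build a small model (illustrative layer sizes).
var model = new Sequential();
model.Add(new Dense(32, activation: "relu", input_shape: new Shape(16)));
model.Add(new Dense(1, activation: "sigmoid"));

// Construct the optimizer; omitted arguments keep the paper defaults
// (lr = 0.001, beta_1 = 0.9, beta_2 = 0.999, epsilon = null, decay = 0).
var adam = new Adam(lr: 0.001f, amsgrad: true);

// Compile the model with the optimizer instance.
model.Compile(optimizer: adam, loss: "binary_crossentropy", metrics: new string[] { "accuracy" });
```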

Implements

System.IDisposable