
Class SGD

Stochastic gradient descent optimizer. Includes support for momentum, learning rate decay, and Nesterov momentum.
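
Remarks

The effect of the momentum, decay, and nesterov arguments sketched below follows the standard upstream Keras SGD update rule; this is an assumption about the Python implementation this class wraps, not something stated on this page. Per update step t, with gradient g_t and velocity v_t:

lr_t = \frac{lr}{1 + decay \cdot t}
v_{t+1} = momentum \cdot v_t - lr_t \, g_t
\theta_{t+1} = \theta_t + v_{t+1} \quad \text{(standard momentum)}
\theta_{t+1} = \theta_t + momentum \cdot v_{t+1} - lr_t \, g_t \quad \text{(Nesterov momentum)}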

Inheritance
System.Object
Keras
Base
SGD
Implements
System.IDisposable
Inherited Members
Base.Parameters
Base.None
Base.Init()
Base.ToPython()
Base.InvokeStaticMethod(Object, String, Dictionary<String, Object>)
Base.InvokeMethod(String, Dictionary<String, Object>)
Base.Item[String]
Keras.Instance
Keras.keras
Keras.keras2onnx
Keras.tfjs
Keras.Dispose()
Keras.ToTuple(Array)
Keras.ToList(Array)
System.Object.Equals(System.Object)
System.Object.Equals(System.Object, System.Object)
System.Object.GetHashCode()
System.Object.GetType()
System.Object.MemberwiseClone()
System.Object.ReferenceEquals(System.Object, System.Object)
System.Object.ToString()
Namespace: Keras.Optimizers
Assembly: Keras.dll
Syntax
public class SGD : Base, IDisposable

Constructors


SGD(Single, Single, Single, Boolean)

Initializes a new instance of the SGD class.

Declaration
public SGD(float lr = 0.01F, float momentum = 0F, float decay = 0F, bool nesterov = false)
Parameters
Type             Name       Description
System.Single    lr         float >= 0. Learning rate.
System.Single    momentum   float >= 0. Parameter that accelerates SGD in the relevant direction and dampens oscillations.
System.Single    decay      float >= 0. Learning rate decay over each update.
System.Boolean   nesterov   boolean. Whether to apply Nesterov momentum.
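
Examples

The following is a minimal usage sketch, not taken from this page: it assumes the Keras.NET Sequential, Dense, Shape, and Model.Compile APIs, and only the SGD constructor parameters documented above are guaranteed by this reference.

using Keras;
using Keras.Layers;
using Keras.Models;
using Keras.Optimizers;

// Build a small feed-forward model (assumed Keras.NET API; see note above).
var model = new Sequential();
model.Add(new Dense(32, activation: "relu", input_shape: new Shape(8)));
model.Add(new Dense(1, activation: "sigmoid"));

// SGD with momentum, a small learning rate decay, and Nesterov updates enabled.
var sgd = new SGD(lr: 0.01f, momentum: 0.9f, decay: 1e-6f, nesterov: true);

// Pass the optimizer instance when compiling the model.
model.Compile(optimizer: sgd, loss: "binary_crossentropy", metrics: new string[] { "accuracy" });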

Implements

System.IDisposable