Class Adamax

Adamax optimizer from Section 7 of the Adam paper.

It is a variant of Adam based on the infinity norm. Default parameters follow those provided in the paper.
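The update rule from Section 7 of the Adam paper can be sketched as follows. This is a minimal scalar Python illustration of the algorithm, not SiaNet's implementation; the default hyperparameter values match this class's constructor defaults.

```python
def adamax_step(theta, grad, m, u, t, lr=0.002, beta1=0.9, beta2=0.999):
    """One Adamax update for a single scalar parameter (Adam paper, Section 7)."""
    m = beta1 * m + (1 - beta1) * grad          # biased first moment estimate
    u = max(beta2 * u, abs(grad))               # exponentially weighted infinity norm
    theta -= (lr / (1 - beta1 ** t)) * m / u    # bias-corrected step; u needs no correction
    return theta, m, u

# Minimize f(x) = x^2 (gradient 2x) starting from x = 1.0.
x, m, u = 1.0, 0.0, 0.0
for t in range(1, 1001):
    x, m, u = adamax_step(x, 2.0 * x, m, u, t)
# x is now close to the minimum at 0.
```

Because `u` is a max rather than a sum of squares, it needs no bias correction, which makes Adamax simpler than Adam in that respect.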

Inheritance
System.Object
BaseOptimizer
Adamax
Inherited Members
BaseOptimizer.Name
BaseOptimizer.LearningRate
BaseOptimizer.Momentum
BaseOptimizer.DecayRate
Namespace: SiaNet.Optimizers
Assembly: SiaNet.dll
Syntax
public class Adamax : BaseOptimizer

Constructors

Adamax(Single, Single, Single, Single)

Initializes a new instance of the Adamax class.

Declaration
public Adamax(float lr = 0.002F, float beta_1 = 0.9F, float beta_2 = 0.999F, float decayRate = 0F)
Parameters

Type           Name       Description
System.Single  lr         The initial learning rate for the optimizer.
System.Single  beta_1     The exponential decay rate for the first moment estimates.
System.Single  beta_2     The exponential decay rate for the exponentially weighted infinity norm.
System.Single  decayRate  Learning rate decay applied over each update.
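The `decayRate` parameter shrinks the learning rate as training progresses. The sketch below assumes the common Keras-style inverse time decay; SiaNet's exact schedule is not documented on this page, so treat the formula as an illustrative assumption.

```python
def decayed_lr(initial_lr, decay_rate, iteration):
    # Inverse time decay, as used by Keras optimizers; SiaNet's exact
    # schedule may differ (this formula is an illustrative assumption).
    return initial_lr / (1.0 + decay_rate * iteration)

# With decayRate = 0 (the default) the learning rate stays constant.
```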

Properties

Beta1

Gets or sets the beta 1 value.

Declaration
public float Beta1 { get; set; }
Property Value

Type           Description
System.Single  The beta 1 value.

Beta2

Gets or sets the beta 2 value.

Declaration
public float Beta2 { get; set; }
Property Value

Type           Description
System.Single  The beta 2 value.

See Also

BaseOptimizer
Generated by DocFX