Class Adamax
Adamax optimizer from Adam paper's Section 7.
It is a variant of Adam based on the infinity norm. Default parameters follow those provided in the paper.
Namespace: SiaNet.Optimizers
Assembly: SiaNet.dll
Syntax
public class Adamax : BaseOptimizer
Constructors
Adamax(Single, Single, Single, Single)
Initializes a new instance of the Adamax class.
Declaration
public Adamax(float lr = 0.002F, float beta_1 = 0.9F, float beta_2 = 0.999F, float decayRate = 0F)
Parameters
| Type | Name | Description |
|---|---|---|
| System.Single | lr | The initial learning rate for the optimizer. |
| System.Single | beta_1 | The exponential decay rate for the first moment estimates. |
| System.Single | beta_2 | The exponential decay rate for the infinity-norm-based second moment estimate. |
| System.Single | decayRate | Learning rate decay applied over each update. |
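To illustrate what these parameters control, the following is a minimal sketch (in Python/NumPy, not the SiaNet implementation) of the Adamax update rule from Section 7 of the Adam paper, using the same parameter names and default values as this constructor:

```python
import numpy as np

def adamax_update(theta, grad, m, u, t, lr=0.002, beta_1=0.9, beta_2=0.999):
    """One Adamax step: Adam's second raw moment is replaced by an
    exponentially weighted infinity norm u (Adam paper, Section 7)."""
    m = beta_1 * m + (1 - beta_1) * grad       # biased first moment estimate
    u = np.maximum(beta_2 * u, np.abs(grad))   # infinity-norm second moment
    # bias-correct the first moment; u needs no bias correction
    theta = theta - (lr / (1 - beta_1 ** t)) * m / u
    return theta, m, u

# one step on a scalar parameter with gradient 0.5
theta, m, u = adamax_update(np.array(1.0), np.array(0.5), 0.0, 0.0, t=1)
```

Because `u` is a running maximum rather than a decaying average, Adamax tends to be less sensitive to occasional large gradients than plain Adam.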
Properties
Beta1
Gets or sets the beta 1 value.
Declaration
public float Beta1 { get; set; }
Property Value
| Type | Description |
|---|---|
| System.Single | The exponential decay rate for the first moment estimates. |
Beta2
Gets or sets the beta 2 value.
Declaration
public float Beta2 { get; set; }
Property Value
Type | Description |
---|---|
System.Single | The beta2. |