Class BatchNormalization
Batch normalization layer (Ioffe and Szegedy, 2015).
Normalizes the activations of the previous layer at each batch, i.e. applies a transformation that maintains the mean activation close to 0 and the activation standard deviation close to 1.
Namespace: SiaNet.Layers
Assembly: SiaNet.dll
Syntax
public class BatchNormalization : BaseLayer
Constructors
BatchNormalization(Int32, Single, Single, Boolean, Boolean, BaseInitializer, BaseRegularizer, BaseConstraint, BaseInitializer, BaseRegularizer, BaseConstraint, BaseInitializer, BaseInitializer)
Initializes a new instance of the BatchNormalization class.
Declaration
public BatchNormalization(int axis = -1, float momentum = 0.99F, float epsilon = 0.001F, bool center = true, bool scale = true, BaseInitializer betaInitializer = null, BaseRegularizer betaRegularizer = null, BaseConstraint betaConstraint = null, BaseInitializer gammaInitializer = null, BaseRegularizer gammaRegularizer = null, BaseConstraint gammaConstraint = null, BaseInitializer movingMeanInitializer = null, BaseInitializer movingVarianceInitializer = null)
Parameters
Type | Name | Description |
---|---|---|
System.Int32 | axis | Integer, the axis that should be normalized (typically the features axis). For instance, after a Conv2D layer with channels-first data, set axis=1 in BatchNormalization. |
System.Single | momentum | Momentum for the moving mean and the moving variance. |
System.Single | epsilon | Small float added to variance to avoid dividing by zero. |
System.Boolean | center | If True, add the beta offset to the normalized tensor. If False, beta is ignored. |
System.Boolean | scale | If True, multiply by gamma. If False, gamma is not used. When the next layer is linear (this also applies to, e.g., relu), this can be disabled, since the scaling will be done by the next layer. |
BaseInitializer | betaInitializer | Initializer for beta weight matrix. |
BaseRegularizer | betaRegularizer | Regularizer function for beta weight matrix. |
BaseConstraint | betaConstraint | Constraint function for beta weight matrix. |
BaseInitializer | gammaInitializer | Initializer for gamma weight matrix. |
BaseRegularizer | gammaRegularizer | Regularizer function for gamma weight matrix. |
BaseConstraint | gammaConstraint | Constraint function for gamma weight matrix. |
BaseInitializer | movingMeanInitializer | Initializer for moving mean weight matrix. |
BaseInitializer | movingVarianceInitializer | Initializer for moving variance weight matrix. |
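Examples
A minimal construction sketch, relying only on the constructor signature documented above; the argument values shown are the documented defaults and are spelled out purely for illustration.

```csharp
using SiaNet.Layers;

// All arguments are optional; the values below are illustrative.
var bn = new BatchNormalization(
    axis: -1,         // normalize the last (features) axis
    momentum: 0.99f,  // smoothing factor for the moving mean/variance
    epsilon: 0.001f,  // small constant added to the variance for stability
    center: true,     // learn and apply the beta offset
    scale: true);     // learn and apply the gamma scale
```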
Properties
Axis
Integer, the axis that should be normalized (typically the features axis). For instance, after a Conv2D layer with channels-first data, set axis=1 in BatchNormalization.
Declaration
public int Axis { get; set; }
Property Value
Type | Description |
---|---|
System.Int32 | The axis. |
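For the channels-first convolutional case mentioned above, the placement is sketched below; the preceding Conv2D layer is only indicated in a comment, since its constructor is documented elsewhere.

```csharp
using SiaNet.Layers;

// Placed immediately after a Conv2D layer operating on channels-first data,
// axis: 1 normalizes over the channel dimension.
var bn = new BatchNormalization(axis: 1);
```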
BetaConstraint
Constraint function for beta weight matrix.
Declaration
public BaseConstraint BetaConstraint { get; set; }
Property Value
Type | Description |
---|---|
BaseConstraint | The beta constraint. |
BetaInitializer
Initializer for beta weight matrix.
Declaration
public BaseInitializer BetaInitializer { get; set; }
Property Value
Type | Description |
---|---|
BaseInitializer | The beta initializer. |
BetaRegularizer
Regularizer function for beta weight matrix.
Declaration
public BaseRegularizer BetaRegularizer { get; set; }
Property Value
Type | Description |
---|---|
BaseRegularizer | The beta regularizer. |
Center
If True, add the beta offset to the normalized tensor. If False, beta is ignored.
Declaration
public bool Center { get; set; }
Property Value
Type | Description |
---|---|
System.Boolean | true if the beta offset is added to the normalized tensor; otherwise, false. |
Epsilon
Small float added to variance to avoid dividing by zero.
Declaration
public float Epsilon { get; set; }
Property Value
Type | Description |
---|---|
System.Single | The epsilon. |
GammaConstraint
Constraint function for gamma weight matrix.
Declaration
public BaseConstraint GammaConstraint { get; set; }
Property Value
Type | Description |
---|---|
BaseConstraint | The gamma constraint. |
GammaInitializer
Initializer for gamma weight matrix.
Declaration
public BaseInitializer GammaInitializer { get; set; }
Property Value
Type | Description |
---|---|
BaseInitializer | The gamma initializer. |
GammaRegularizer
Regularizer function for gamma weight matrix.
Declaration
public BaseRegularizer GammaRegularizer { get; set; }
Property Value
Type | Description |
---|---|
BaseRegularizer | The gamma regularizer. |
Momentum
Momentum for the moving mean and the moving variance.
Declaration
public float Momentum { get; set; }
Property Value
Type | Description |
---|---|
System.Single | The momentum. |
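The momentum controls an exponential moving average of the per-batch statistics. The helper below is a hedged sketch of the conventional update rule implied by the 0.99 default; it is illustrative and not taken from the SiaNet source.

```csharp
// Assumed convention: a larger momentum makes the running statistics
// change more slowly from batch to batch.
static float UpdateMovingStatistic(float moving, float batchStatistic, float momentum = 0.99f)
{
    return moving * momentum + batchStatistic * (1f - momentum);
}
```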
MovingMeanInitializer
Initializer for moving mean weight matrix.
Declaration
public BaseInitializer MovingMeanInitializer { get; set; }
Property Value
Type | Description |
---|---|
BaseInitializer | The moving mean initializer. |
MovingVarianceInitializer
Initializer for moving variance weight matrix.
Declaration
public BaseInitializer MovingVarianceInitializer { get; set; }
Property Value
Type | Description |
---|---|
BaseInitializer | The moving variance initializer. |
Scale
If True, multiply by gamma. If False, gamma is not used. When the next layer is linear (this also applies to, e.g., relu), this can be disabled, since the scaling will be done by the next layer.
Declaration
public bool Scale { get; set; }
Property Value
Type | Description |
---|---|
System.Boolean | true if the normalized output is multiplied by gamma; otherwise, false. |
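Because a following linear (or relu) layer can absorb the scaling, gamma can be switched off at construction time, as sketched below; all other arguments keep their defaults.

```csharp
using SiaNet.Layers;

// The next layer is linear (e.g. a Dense layer) or relu, so the learned
// gamma scale would be redundant and is disabled here.
var bn = new BatchNormalization(scale: false);
```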
Methods
Backward(Tensor)
Calculates the gradient of this layer's function.
Declaration
public override void Backward(Tensor outputgrad)
Parameters
Type | Name | Description |
---|---|---|
Tensor | outputgrad | The gradient of the loss with respect to this layer's output, propagated back from the following layer. |
Forward(Tensor)
Forwards the inputs and computes the output.
Declaration
public override void Forward(Tensor x)
Parameters
Type | Name | Description |
---|---|---|
Tensor | x | The input tensor for this layer. |
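The call pattern below is a hedged sketch of how a training step might drive these two overrides. Only the Forward and Backward signatures documented above are relied upon; the Tensor namespace and the way the input and incoming gradient tensors are produced are assumptions, since they are not covered on this page.

```csharp
using SiaNet.Engine;   // assumed namespace for Tensor; not stated on this page
using SiaNet.Layers;

public static class BatchNormalizationSketch
{
    // 'input' and 'incomingGradient' stand in for tensors produced elsewhere
    // in the training pipeline (their construction is backend-specific).
    public static void TrainingStep(Tensor input, Tensor incomingGradient)
    {
        var bn = new BatchNormalization();

        bn.Forward(input);              // compute the normalized output
        bn.Backward(incomingGradient);  // compute gradients for backpropagation
    }
}
```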