Class BatchNormalization

Batch normalization layer (Ioffe and Szegedy, 2015).

Normalizes the activations of the previous layer at each batch, i.e. applies a transformation that keeps the mean activation close to 0 and the activation standard deviation close to 1.
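For each mini-batch, the transformation applied is the standard one from the batch normalization paper (the symbols below are the usual mathematical notation, not identifiers from SiaNet's source):

\hat{x} = \frac{x - \mu_B}{\sqrt{\sigma_B^2 + \epsilon}}, \qquad y = \gamma\,\hat{x} + \beta

where \mu_B and \sigma_B^2 are the mean and variance of the current batch along the normalized axis, \gamma (Scale) and \beta (Center) are learned parameters, and \epsilon is the Epsilon property.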

Inheritance
System.Object
BaseLayer
BatchNormalization
Inherited Members
BaseLayer.Params
BaseLayer.Input
BaseLayer.Output
BaseLayer.Name
BaseLayer.SkipPred
BaseLayer.Item[String]
BaseLayer.BuildParam(String, Int64[], DataType, BaseInitializer, BaseConstraint, BaseRegularizer, Boolean)
Namespace: SiaNet.Layers
Assembly: SiaNet.dll
Syntax
public class BatchNormalization : BaseLayer

Constructors

BatchNormalization(Int32, Single, Single, Boolean, Boolean, BaseInitializer, BaseRegularizer, BaseConstraint, BaseInitializer, BaseRegularizer, BaseConstraint, BaseInitializer, BaseInitializer)

Initializes a new instance of the BatchNormalization class.

Declaration
public BatchNormalization(int axis = -1, float momentum = 0.99F, float epsilon = 0.001F, bool center = true, bool scale = true, BaseInitializer betaInitializer = null, BaseRegularizer betaRegularizer = null, BaseConstraint betaConstraint = null, BaseInitializer gammaInitializer = null, BaseRegularizer gammaRegularizer = null, BaseConstraint gammaConstraint = null, BaseInitializer movingMeanInitializer = null, BaseInitializer movingVarianceInitializer = null)
Parameters
Type Name Description
System.Int32 axis

Integer, the axis that should be normalized (typically the features axis). For instance, after a Conv2D layer, set axis=1 in BatchNormalization.

System.Single momentum

Momentum for the moving mean and the moving variance.

System.Single epsilon

Small float added to variance to avoid dividing by zero.

System.Boolean center

If True, add offset of beta to normalized tensor. If False, beta is ignored.

System.Boolean scale

If True, multiply by gamma. If False, gamma is not used. When the next layer is linear (this also applies to e.g. ReLU), scaling can be disabled, since it will be done by the next layer.

BaseInitializer betaInitializer

Initializer for beta weight matrix.

BaseRegularizer betaRegularizer

Regularizer function for beta weight matrix.

BaseConstraint betaConstraint

Constraint function for beta weight matrix.

BaseInitializer gammaInitializer

Initializer for gamma weight matrix.

BaseRegularizer gammaRegularizer

Regularizer function for gamma weight matrix.

BaseConstraint gammaConstraint

Constraint function for gamma weight matrix.

BaseInitializer movingMeanInitializer

Initializer for moving mean weight matrix.

BaseInitializer movingVarianceInitializer

Initializer for moving variance weight matrix.
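
A minimal construction example based on the signature documented above. The argument values are illustrative; how the layer is then added to a model depends on the SiaNet model API and is not covered here.

// Batch normalization over the channel axis (axis = 1), as typically used
// after a Conv2D layer; the remaining arguments keep their documented defaults.
var bn = new BatchNormalization(
    axis: 1,
    momentum: 0.9f,     // slightly lower than the 0.99 default
    epsilon: 0.001f,    // numerical stability term added to the variance
    center: true,       // learn and apply the beta offset
    scale: true);       // learn and apply the gamma scale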

Properties

Axis

Integer, the axis that should be normalized (typically the features axis). For instance, after a Conv2D layer, set axis=1 in BatchNormalization.

Declaration
public int Axis { get; set; }
Property Value
Type Description
System.Int32

The axis.

BetaConstraint

Constraint function for beta weight matrix.

Declaration
public BaseConstraint BetaConstraint { get; set; }
Property Value
Type Description
BaseConstraint

The beta constraint.

BetaInitializer

Initializer for beta weight matrix.

Declaration
public BaseInitializer BetaInitializer { get; set; }
Property Value
Type Description
BaseInitializer

The beta initializer.

BetaRegularizer

Regularizer function for beta weight matrix.

Declaration
public BaseRegularizer BetaRegularizer { get; set; }
Property Value
Type Description
BaseRegularizer

The beta regularizer.

Center

If True, add offset of beta to normalized tensor. If False, beta is ignored.

Declaration
public bool Center { get; set; }
Property Value
Type Description
System.Boolean

true if center; otherwise, false.

Epsilon

Small float added to variance to avoid dividing by zero.

Declaration
public float Epsilon { get; set; }
Property Value
Type Description
System.Single

The epsilon.

GammaConstraint

Constraint function for gamma weight matrix.

Declaration
public BaseConstraint GammaConstraint { get; set; }
Property Value
Type Description
BaseConstraint

The gamma constraint.

GammaInitializer

Initializer for gamma weight matrix.

Declaration
public BaseInitializer GammaInitializer { get; set; }
Property Value
Type Description
BaseInitializer

The gamma initializer.

GammaRegularizer

Regularizer function for gamma weight matrix.

Declaration
public BaseRegularizer GammaRegularizer { get; set; }
Property Value
Type Description
BaseRegularizer

The gamma regularizer.

Momentum

Momentum for the moving mean and the moving variance.

Declaration
public float Momentum { get; set; }
Property Value
Type Description
System.Single

The momentum.
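
Assuming SiaNet follows the usual convention for this parameter, the running statistics used at inference time are updated once per batch as exponential moving averages:

\mu_{moving} \leftarrow m\,\mu_{moving} + (1 - m)\,\mu_B, \qquad \sigma^2_{moving} \leftarrow m\,\sigma^2_{moving} + (1 - m)\,\sigma^2_B

where m is the momentum and \mu_B, \sigma^2_B are the statistics of the current batch.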

MovingMeanInitializer

Initializer for moving mean weight matrix.

Declaration
public BaseInitializer MovingMeanInitializer { get; set; }
Property Value
Type Description
BaseInitializer

The moving mean initializer.

MovingVarianceInitializer

Initializer for moving variance weight matrix.

Declaration
public BaseInitializer MovingVarianceInitializer { get; set; }
Property Value
Type Description
BaseInitializer

The moving variance initializer.

Scale

If True, multiply by gamma. If False, gamma is not used. When the next layer is linear (this also applies to e.g. ReLU), scaling can be disabled, since it will be done by the next layer.

Declaration
public bool Scale { get; set; }
Property Value
Type Description
System.Boolean

true if scale; otherwise, false.

Methods

Backward(Tensor)

Calculates the gradient of this layer's function.

Declaration
public override void Backward(Tensor outputgrad)
Parameters
Type Name Description
Tensor outputgrad

The output gradient propagated from the previous layer in the backward pass.

Overrides
BaseLayer.Backward(Tensor)
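
For reference, the textbook batch-normalization gradients (not necessarily the exact form used internally by this implementation) for a batch of size N are:

\frac{\partial L}{\partial \gamma} = \sum_i \frac{\partial L}{\partial y_i}\,\hat{x}_i, \qquad \frac{\partial L}{\partial \beta} = \sum_i \frac{\partial L}{\partial y_i}

\frac{\partial L}{\partial x_i} = \frac{\gamma}{N\,\sqrt{\sigma_B^2 + \epsilon}} \left( N\,\frac{\partial L}{\partial y_i} - \sum_j \frac{\partial L}{\partial y_j} - \hat{x}_i \sum_j \frac{\partial L}{\partial y_j}\,\hat{x}_j \right)

with \hat{x} as defined in the class summary.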

Forward(Tensor)

Forwards the inputs and computes the output.

Declaration
public override void Forward(Tensor x)
Parameters
Type Name Description
Tensor x

The input tensor for this layer.

Overrides
BaseLayer.Forward(Tensor)
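
As an illustration of the training-time computation, the sketch below normalizes a plain float array for a single feature. The actual layer operates on Tensor objects and normalizes along the configured Axis, so this is a simplification rather than SiaNet's implementation.

using System;
using System.Linq;

static class BatchNormSketch
{
    // Training-time forward pass for one feature over one mini-batch.
    static float[] Forward(float[] x, float gamma, float beta, float epsilon = 0.001f)
    {
        float mean = x.Average();                                           // batch mean
        float variance = x.Select(v => (v - mean) * (v - mean)).Average();  // batch variance
        // Normalize, then apply the learned scale (gamma) and offset (beta).
        return x.Select(v => gamma * (v - mean) / MathF.Sqrt(variance + epsilon) + beta).ToArray();
    }
}

At inference time the batch statistics are replaced by the moving mean and moving variance accumulated during training.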

See Also

BaseLayer