
Class BatchNormalization

Batch normalization layer (Ioffe and Szegedy, 2014). Normalizes the activations of the previous layer at each batch, i.e. applies a transformation that keeps the mean activation close to 0 and the activation standard deviation close to 1.

Inheritance
System.Object
Keras
Base
BaseLayer
BatchNormalization
Implements
System.IDisposable
Inherited Members
BaseLayer.Set(BaseLayer[])
Base.Parameters
Base.None
Base.Init()
Base.ToPython()
Base.InvokeStaticMethod(Object, String, Dictionary<String, Object>)
Base.InvokeMethod(String, Dictionary<String, Object>)
Base.Item[String]
Keras.Instance
Keras.keras
Keras.keras2onnx
Keras.tfjs
Keras.Dispose()
Keras.ToTuple(Array)
Keras.ToList(Array)
System.Object.Equals(System.Object)
System.Object.Equals(System.Object, System.Object)
System.Object.GetHashCode()
System.Object.GetType()
System.Object.MemberwiseClone()
System.Object.ReferenceEquals(System.Object, System.Object)
System.Object.ToString()
Namespace: Keras.Layers
Assembly: Keras.dll
Syntax
public class BatchNormalization : BaseLayer, IDisposable

Constructors


BatchNormalization(Int32, Single, Single, Boolean, Boolean, String, String, String, String, String, String, String, String)

Initializes a new instance of the BatchNormalization class.

Declaration
public BatchNormalization(int axis = -1, float momentum = 0.99F, float epsilon = 0.001F, bool center = true, bool scale = true, string beta_initializer = "zeros", string gamma_initializer = "ones", string moving_mean_initializer = "zeros", string moving_variance_initializer = "ones", string beta_regularizer = "", string gamma_regularizer = "", string beta_constraint = "", string gamma_constraint = "")
Parameters
Type Name Description
System.Int32 axis

Integer, the axis that should be normalized (typically the features axis). For instance, after a Conv2D layer with data_format="channels_first", set axis=1 in BatchNormalization.

System.Single momentum

Momentum for the moving mean and the moving variance.

System.Single epsilon

Small float added to variance to avoid dividing by zero.

System.Boolean center

If True, add offset of beta to normalized tensor. If False, beta is ignored.

System.Boolean scale

If True, multiply by gamma. If False, gamma is not used. When the next layer is linear (e.g. nn.relu), this can be disabled, since the scaling will be done by the next layer.

System.String beta_initializer

Initializer for the beta weight.

System.String gamma_initializer

Initializer for the gamma weight.

System.String moving_mean_initializer

Initializer for the moving mean.

System.String moving_variance_initializer

Initializer for the moving variance.

System.String beta_regularizer

Optional regularizer for the beta weight.

System.String gamma_regularizer

Optional regularizer for the gamma weight.

System.String beta_constraint

Optional constraint for the beta weight.

System.String gamma_constraint

Optional constraint for the gamma weight.
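A minimal usage sketch of this constructor. The `Sequential`, `Dense`, `Activation`, and `Shape` types are assumed from the same Keras.NET library (namespaces `Keras.Models`, `Keras.Layers`, and `Keras`); their exact signatures are not documented on this page, so treat this as illustrative rather than definitive:

```csharp
using Keras;
using Keras.Layers;
using Keras.Models;

// Sketch, assuming Keras.NET's Sequential/Dense/Activation APIs.
// With default arguments, BatchNormalization normalizes the last
// axis (axis = -1), which is the features axis of a Dense output.
var model = new Sequential();
model.Add(new Dense(64, input_shape: new Shape(784)));
model.Add(new BatchNormalization()); // axis: -1, momentum: 0.99f, epsilon: 0.001f
model.Add(new Activation("relu"));
model.Add(new Dense(10, activation: "softmax"));
```

After a `Conv2D` layer with `data_format: "channels_first"`, pass `axis: 1` instead, as noted in the `axis` parameter description above, so the channel axis is the one being normalized.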

Implements

System.IDisposable
Generated by DocFX