Namespace SiaNet.Layers
Classes
AvgPooling1D
Average pooling operation for 1D (temporal) data.
AvgPooling2D
Average pooling operation for 2D (spatial) data.
AvgPooling3D
Average pooling operation for 3D (spatial or spatio-temporal) data.
BaseLayer
Base class for the layers
BatchNormalization
Batch normalization layer (Ioffe and Szegedy, 2015).
Normalizes the activations of the previous layer at each batch, i.e. applies a transformation that maintains the mean activation close to 0 and the activation standard deviation close to 1.
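A minimal sketch in plain C# (not SiaNet code) of the normalization described above: for a single unit, the activations across a batch are shifted and scaled to have mean close to 0 and standard deviation close to 1. The batch values, the epsilon constant, and the omission of the layer's learnable scale and shift parameters are assumptions made for illustration.

```csharp
using System;
using System.Linq;

// One unit's activations across a batch of 4 samples.
double[] batch = { 2.0, 4.0, 6.0, 8.0 };
double mean = batch.Average();
double variance = batch.Select(x => (x - mean) * (x - mean)).Average();
double eps = 1e-5;                                   // small constant for numerical stability (assumed)

double[] normalized = batch.Select(x => (x - mean) / Math.Sqrt(variance + eps)).ToArray();

Console.WriteLine(string.Join(", ", normalized));    // approx. -1.34, -0.45, 0.45, 1.34
```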
Conv1D
1D convolution layer (e.g. temporal convolution).
Conv2D
2D convolution layer (e.g. spatial convolution over images).
Conv2DTranspose
Transposed 2D convolution layer, sometimes called deconvolution.
Conv3D
3D convolution layer (e.g. spatial convolution over volumes).
Conv3DTranspose
Transposed 3D convolution layer, sometimes called deconvolution.
Dense
Dense implements the operation: output = activation(dot(input, kernel) + bias), where activation is the element-wise activation function passed as the activation argument, kernel is a weights matrix created by the layer, and bias is a bias vector created by the layer (only applicable if use_bias is True).
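A minimal sketch in plain C# (not the SiaNet API) of the operation spelled out above, output = activation(dot(input, kernel) + bias). The 3x2 kernel, the bias values, and the choice of ReLU as the activation are assumptions for the example; in the real layer the kernel and bias are learned.

```csharp
using System;

double[] input = { 1.0, 2.0, 3.0 };
double[,] kernel = { { 0.1, -0.2 }, { 0.3, 0.4 }, { -0.5, 0.6 } };  // weights matrix created by the layer
double[] bias = { 0.05, -0.05 };                                    // bias vector (use_bias = true)
Func<double, double> activation = x => Math.Max(0.0, x);            // element-wise ReLU (assumed)

double[] output = new double[2];
for (int j = 0; j < 2; j++)
{
    double sum = bias[j];
    for (int i = 0; i < 3; i++)
        sum += input[i] * kernel[i, j];                             // dot(input, kernel)
    output[j] = activation(sum);
}

Console.WriteLine(string.Join(", ", output));                       // 0, 2.35
```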
Dropout
Applies Dropout to the input. Dropout consists of randomly setting a fraction rate of input units to 0 at each update during training time, which helps prevent overfitting.
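A minimal sketch in plain C# (not SiaNet code) of what happens during training: each input unit is zeroed with probability rate. The rescaling of surviving units by 1/(1 - rate) is the common "inverted dropout" convention and is an assumption here, not something the description states; the seed and input values are likewise chosen for illustration.

```csharp
using System;
using System.Linq;

double rate = 0.5;                               // fraction of input units to drop
var rng = new Random(42);                        // fixed seed, for a reproducible example
double[] input = { 1.0, 2.0, 3.0, 4.0 };

double[] dropped = input
    .Select(x => rng.NextDouble() < rate ? 0.0 : x / (1.0 - rate))  // zero out, or rescale survivors
    .ToArray();

Console.WriteLine(string.Join(", ", dropped));   // some entries are 0, the rest are doubled
```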
Embedding
Turns positive integers (indexes) into dense vectors of fixed size, e.g. [[4], [20]] -> [[0.25, 0.1], [0.6, -0.2]].
This layer can only be used as the first layer in a model.
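A minimal sketch in plain C# (not SiaNet code) of the lookup illustrated above: each integer index selects a row of a (vocabulary size x embedding dimension) weight table. The table values are hard-coded here to reproduce the example mapping; in the real layer they are learned, and the vocabulary size of 32 is an assumption.

```csharp
using System;

int vocabSize = 32, embeddingDim = 2;
double[,] table = new double[vocabSize, embeddingDim];   // learned weights in the real layer
table[4, 0] = 0.25; table[4, 1] = 0.1;
table[20, 0] = 0.6; table[20, 1] = -0.2;

int[] indices = { 4, 20 };                               // positive integer inputs
foreach (int idx in indices)
    Console.WriteLine($"{idx} -> [{table[idx, 0]}, {table[idx, 1]}]");
// 4 -> [0.25, 0.1]
// 20 -> [0.6, -0.2]
```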
Flatten
Flattens the input. Does not affect the batch size.
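A minimal sketch in plain C# (not SiaNet code): every dimension except the batch dimension is collapsed into one, so a (batch, 2, 3) input becomes (batch, 6). The shapes are assumptions chosen for illustration.

```csharp
using System;

int batch = 2, height = 2, width = 3;
double[,,] input = new double[batch, height, width];
double[,] flat = new double[batch, height * width];

for (int b = 0; b < batch; b++)
    for (int h = 0; h < height; h++)
        for (int w = 0; w < width; w++)
            flat[b, h * width + w] = input[b, h, w];     // batch axis is left untouched

Console.WriteLine($"{flat.GetLength(0)} x {flat.GetLength(1)}");   // 2 x 6
```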
GlobalPooling1D
Global pooling operation for 1D (temporal) data.
GlobalPooling2D
Global pooling operation for 2D (spatial) data.
GlobalPooling3D
Global pooling operation for 3D data.
MaxPooling1D
Max pooling operation for 1D (temporal) data.
MaxPooling2D
Max pooling operation for 2D (spatial) data.
MaxPooling3D
Max pooling operation for 3D data.
Permute
Permutes the dimensions of the input according to a given pattern. Useful for e.g. connecting RNNs and convnets together.
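A minimal sketch in plain C# (not SiaNet code) of the pattern (2, 1), which swaps the two non-batch dimensions: a (batch, 3, 4) input becomes (batch, 4, 3). The pattern and shapes are assumptions chosen for illustration.

```csharp
using System;

int batch = 1, d1 = 3, d2 = 4;
double[,,] input = new double[batch, d1, d2];            // shape (batch, 3, 4)
double[,,] permuted = new double[batch, d2, d1];         // shape (batch, 4, 3)

for (int b = 0; b < batch; b++)
    for (int i = 0; i < d1; i++)
        for (int j = 0; j < d2; j++)
            permuted[b, j, i] = input[b, i, j];          // pattern (2, 1): swap the two non-batch axes

Console.WriteLine($"{permuted.GetLength(1)} x {permuted.GetLength(2)}");   // 4 x 3
```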
Repeat
Repeats the input a specified number of times along a given axis.
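A minimal sketch in plain C# (not SiaNet code): the input row is repeated 3 times along a new axis, turning a (batch, 4) input into (batch, 3, 4). The repeat count, axis, and shapes are assumptions chosen for illustration.

```csharp
using System;

int batch = 1, features = 4, n = 3;
double[,] input = { { 1.0, 2.0, 3.0, 4.0 } };            // shape (batch, 4)
double[,,] repeated = new double[batch, n, features];    // shape (batch, 3, 4)

for (int b = 0; b < batch; b++)
    for (int r = 0; r < n; r++)
        for (int f = 0; f < features; f++)
            repeated[b, r, f] = input[b, f];             // each repetition is a copy of the input row

Console.WriteLine($"{repeated.GetLength(1)} x {repeated.GetLength(2)}");   // 3 x 4
```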
Reshape
Reshapes the output to a given shape.
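A minimal sketch in plain C# (not SiaNet code): reshaping keeps the elements and their order but reinterprets them under a new shape, here a length-6 vector viewed as a 2x3 matrix. The values and target shape are assumptions chosen for illustration.

```csharp
using System;

double[] output = { 1, 2, 3, 4, 5, 6 };                  // 6 elements
int rows = 2, cols = 3;                                  // target shape (2, 3)
double[,] reshaped = new double[rows, cols];

for (int i = 0; i < output.Length; i++)
    reshaped[i / cols, i % cols] = output[i];            // same elements, same order, new shape

Console.WriteLine($"{reshaped[0, 2]}, {reshaped[1, 0]}"); // 3, 4
```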