Class Selu
SELU is defined as scale * elu(x, alpha), where alpha and scale are predefined constants (alpha ≈ 1.6732632, scale ≈ 1.0507010). The values of alpha and scale are chosen so that the mean and variance of the inputs are preserved between two consecutive layers, as long as the weights are initialized correctly (see lecun_normal initialization) and the number of inputs is "large enough".
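The piecewise form and the published constant values can be illustrated with a minimal scalar sketch (this is not SiaNet's Tensor-based implementation; the constants are the values from Klambauer et al., "Self-Normalizing Neural Networks", 2017):

```csharp
using System;

// Minimal scalar sketch of SELU (illustrative, not SiaNet's Tensor code).
public static class SeluMath
{
    public const double Alpha = 1.6732632423543772;
    public const double Scale = 1.0507009873554805;

    // SELU(x) = scale * x                  for x > 0
    //         = scale * alpha * (e^x - 1)  for x <= 0   (= scale * elu(x, alpha))
    public static double Selu(double x) =>
        x > 0 ? Scale * x : Scale * Alpha * (Math.Exp(x) - 1.0);
}
```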
Namespace: SiaNet.Layers.Activations
Assembly: SiaNet.dll
Syntax
public class Selu : Elu
Constructors
Selu()
Initializes a new instance of the Selu class.
Declaration
public Selu()
Methods
Backward(Tensor)
Calculates the gradient of this layer's activation function.
Declaration
public override void Backward(Tensor outputgrad)
Parameters
Type | Name | Description |
---|---|---|
Tensor | outputgrad | The output gradient flowing back into this layer during backpropagation. |
Overrides
Elu.Backward(Tensor)
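For reference, the elementwise derivative that the backward pass applies via the chain rule (multiplying by outputgrad) can be sketched in scalar form. This is an illustration of the standard SELU gradient under the definition above, not SiaNet's actual Tensor code:

```csharp
using System;

public static class SeluGradMath
{
    // Scalar sketch of the SELU derivative (illustrative only):
    //   d/dx SELU(x) = scale                for x > 0
    //                = scale * alpha * e^x  for x <= 0
    // The layer's Backward multiplies this, elementwise, by outputgrad.
    public static double SeluGrad(double x)
    {
        const double Alpha = 1.6732632423543772;
        const double Scale = 1.0507009873554805;
        return x > 0 ? Scale : Scale * Alpha * Math.Exp(x);
    }
}
```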
Forward(Tensor)
Forwards the input and computes the output.
Declaration
public override void Forward(Tensor x)
Parameters
Type | Name | Description |
---|---|---|
Tensor | x | The input tensor for this layer. |
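As a quick sanity check, the analytic derivative can be compared against a central finite difference of the forward function. The snippet below is a self-contained sketch and does not use SiaNet's API:

```csharp
using System;

public static class SeluGradCheck
{
    public static void Main()
    {
        const double Alpha = 1.6732632423543772;
        const double Scale = 1.0507009873554805;

        double Selu(double x) => x > 0 ? Scale * x : Scale * Alpha * (Math.Exp(x) - 1.0);
        double SeluGrad(double x) => x > 0 ? Scale : Scale * Alpha * Math.Exp(x);

        // Compare the analytic derivative against a central finite difference.
        foreach (double x in new[] { -2.0, -0.5, 0.3, 1.5 })
        {
            const double eps = 1e-6;
            double numeric = (Selu(x + eps) - Selu(x - eps)) / (2 * eps);
            Console.WriteLine($"x={x,5}: analytic={SeluGrad(x):F8}, numeric={numeric:F8}");
        }
    }
}
```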