
Class Selu

SELU is equal to scale * elu(x, alpha), where alpha and scale are predefined constants. The values of alpha and scale are chosen so that the mean and variance of the inputs are preserved between two consecutive layers, as long as the weights are initialized correctly (see lecun_normal initialization) and the number of inputs is "large enough".
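For reference, this is the activation in piecewise form, with the standard constant values derived in Klambauer et al. (2017), "Self-Normalizing Neural Networks" (that SiaNet uses exactly these values internally is an assumption, since this page only calls them "predefined constants"):

$$
\operatorname{SELU}(x) = \lambda \begin{cases} x & \text{if } x > 0 \\ \alpha \left(e^{x} - 1\right) & \text{if } x \le 0 \end{cases},
\qquad \alpha \approx 1.6732632,\ \lambda \approx 1.0507010
$$

Here $\lambda$ is the scale constant and $\alpha$ is the alpha constant inherited from Elu.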

Inheritance
System.Object
BaseLayer
Elu
Selu
Inherited Members
Elu.Alpha
BaseLayer.Params
BaseLayer.Input
BaseLayer.Output
BaseLayer.Name
BaseLayer.SkipPred
BaseLayer.Item[String]
BaseLayer.BuildParam(String, Int64[], DataType, BaseInitializer, BaseConstraint, BaseRegularizer, Boolean)
Namespace: SiaNet.Layers.Activations
Assembly: SiaNet.dll
Syntax
public class Selu : Elu

Constructors


Selu()

Initializes a new instance of the Selu class.

Declaration
public Selu()
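A minimal construction sketch, assuming the SiaNet.dll assembly is referenced. The constructor takes no arguments because alpha and scale are predefined constants rather than user-supplied parameters:

```csharp
using SiaNet.Layers.Activations;

// Selu exposes only a parameterless constructor; alpha and scale
// are fixed internally.
var selu = new Selu();
```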

Methods


Backward(Tensor)

Calculates the gradient of this layer's activation function.

Declaration
public override void Backward(Tensor outputgrad)
Parameters
Type: Tensor
Name: outputgrad
Description: The output gradient propagated from the previous layer in the backward pass.

Overrides
Elu.Backward(Tensor)
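The following is a standalone numeric sketch, not SiaNet's actual implementation, of the math Backward computes element-wise: the incoming output gradient is multiplied by the SELU derivative, which is scale for x > 0 and scale * alpha * exp(x) for x <= 0. The constant values are the standard ones from the SELU paper and are an assumption here:

```csharp
using System;

const double Alpha = 1.6732632423543772;
const double Scale = 1.0507009873554805;

// Chain rule: dL/dx = outputGrad * dSELU(x)/dx.
static double SeluGrad(double x, double outputGrad) =>
    outputGrad * (x > 0 ? Scale : Scale * Alpha * Math.Exp(x));

Console.WriteLine(SeluGrad(2.0, 1.0));   // ≈ 1.0507 (linear region)
Console.WriteLine(SeluGrad(-2.0, 1.0));  // ≈ 0.2380 (saturating region)
```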

Forward(Tensor)

Forwards the inputs and computes the output.

Declaration
public override void Forward(Tensor x)
Parameters
Type: Tensor
Name: x
Description: The input tensor for this layer.

Overrides
Elu.Forward(Tensor)
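As with Backward above, here is a standalone numeric sketch, not SiaNet's actual implementation, of the element-wise computation Forward performs: scale * elu(x, alpha). The constant values are the standard ones from the SELU paper and are an assumption here:

```csharp
using System;

const double Alpha = 1.6732632423543772;
const double Scale = 1.0507009873554805;

// SELU(x) = scale * x            for x > 0
//         = scale * alpha*(e^x-1) for x <= 0
static double Selu(double x) =>
    Scale * (x > 0 ? x : Alpha * (Math.Exp(x) - 1.0));

Console.WriteLine(Selu(1.0));   // ≈ 1.0507
Console.WriteLine(Selu(-1.0));  // ≈ -1.1113
```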

See Also

Elu