
Class Elu

Exponential linear unit activation function: x if x > 0 and alpha * (exp(x)-1) if x < 0.
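
As an illustrative sketch (not the SiaNet implementation), the same definition applied to a single value can be written as:

// Illustrative scalar ELU, matching the definition above (assumes using System for Math.Exp).
static float EluValue(float x, float alpha = 1f)
{
    return x > 0f ? x : alpha * ((float)Math.Exp(x) - 1f);
}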

Inheritance
System.Object
BaseLayer
Elu
Selu
Inherited Members
BaseLayer.Params
BaseLayer.Input
BaseLayer.Output
BaseLayer.Name
BaseLayer.SkipPred
BaseLayer.Item[String]
BaseLayer.BuildParam(String, Int64[], DataType, BaseInitializer, BaseConstraint, BaseRegularizer, Boolean)
Namespace: SiaNet.Layers.Activations
Assembly: SiaNet.dll
Syntax
public class Elu : BaseLayer

Constructors


Elu(Single)

Initializes a new instance of the Elu class.

Declaration
public Elu(float alpha = 1F)
Parameters
Type Name Description
System.Single alpha

Scale of the negative section; for large negative inputs the activation saturates to -alpha.
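
For example, the layer can be constructed with the default alpha or a custom value:

// Default alpha of 1.0
var elu = new Elu();

// Custom alpha; the negative branch saturates to -0.5 instead of -1.0
var eluCustom = new Elu(alpha: 0.5f);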

Properties


Alpha

Scale of the negative section; the activation saturates to -alpha for large negative inputs.

Declaration
public float Alpha { get; set; }
Property Value
Type Description
System.Single

The alpha.
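
The value can also be changed after construction, for example:

var elu = new Elu();
elu.Alpha = 0.5f; // equivalent to new Elu(alpha: 0.5f)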

Methods


Backward(Tensor)

Calculates the gradient of this layer's activation function during backpropagation.

Declaration
public override void Backward(Tensor outputgrad)
Parameters
Type Name Description
Tensor outputgrad

The gradient of the loss with respect to this layer's output, propagated back from the following layer.

Overrides
BaseLayer.Backward(Tensor)
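
Conceptually, the backward pass scales the incoming output gradient by the ELU derivative, which is 1 for positive inputs and alpha * exp(x) otherwise. A minimal elementwise sketch over plain float arrays (illustrative only; the actual method operates on SiaNet Tensors):

// Illustrative ELU backward pass: inputGrad[i] = outputGrad[i] * dELU/dx at x[i].
static float[] EluBackward(float[] x, float[] outputGrad, float alpha = 1f)
{
    var inputGrad = new float[x.Length];
    for (int i = 0; i < x.Length; i++)
    {
        float dydx = x[i] > 0f ? 1f : alpha * (float)Math.Exp(x[i]);
        inputGrad[i] = outputGrad[i] * dydx;
    }
    return inputGrad;
}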

Forward(Tensor)

Forwards the inputs and computes the output.

Declaration
public override void Forward(Tensor x)
Parameters
Type Name Description
Tensor x

The input tensor for this layer.

Overrides
BaseLayer.Forward(Tensor)
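
As with Backward, the forward computation can be sketched elementwise over a plain float array (illustrative only; the actual method operates on SiaNet Tensors):

// Illustrative ELU forward pass applied elementwise.
static float[] EluForward(float[] x, float alpha = 1f)
{
    var y = new float[x.Length];
    for (int i = 0; i < x.Length; i++)
        y[i] = x[i] > 0f ? x[i] : alpha * ((float)Math.Exp(x[i]) - 1f);
    return y;
}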

See Also

BaseLayer