Class LeakyRelu
Leaky version of a Rectified Linear Unit.
It allows a small gradient when the unit is not active: f(x) = alpha * x for x < 0, f(x) = x for x >= 0.
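The activation above can be sketched as a plain C# function. This is a conceptual illustration of the formula, not SiaNet's actual implementation (the real layer operates on `Tensor` objects); the default `alpha` of `0.3f` matches the constructor below.

```csharp
using System;

class Demo
{
    // Leaky ReLU: f(x) = alpha * x for x < 0, f(x) = x for x >= 0.
    static float LeakyRelu(float x, float alpha = 0.3f)
        => x >= 0f ? x : alpha * x;

    static void Main()
    {
        Console.WriteLine(LeakyRelu(5f));   // positive inputs pass through unchanged
        Console.WriteLine(LeakyRelu(-2f));  // negative inputs are scaled by alpha
    }
}
```

Unlike a plain ReLU, negative inputs keep a small non-zero slope, which avoids "dead" units whose gradient is always zero.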
Namespace: SiaNet.Layers.Activations
Assembly: SiaNet.dll
Syntax
public class LeakyRelu : BaseLayer
Constructors
LeakyRelu(Single)
Initializes a new instance of the LeakyRelu class.
Declaration
public LeakyRelu(float alpha = 0.3F)
Parameters
Type | Name | Description |
---|---|---|
System.Single | alpha | Negative slope coefficient. |
Properties
Alpha
Negative slope coefficient.
Declaration
public float Alpha { get; set; }
Property Value
Type | Description |
---|---|
System.Single | The alpha. |
Methods
Backward(Tensor)
Calculates the gradient of this layer's function.
Declaration
public override void Backward(Tensor outputgrad)
Parameters
Type | Name | Description |
---|---|---|
Tensor | outputgrad | The output gradient calculated by the previous layer. |
Overrides
BaseLayer.Backward(Tensor)
Forward(Tensor)
Forwards the inputs and computes the output.
Declaration
public override void Forward(Tensor x)
Parameters
Type | Name | Description |
---|---|---|
Tensor | x | The input tensor for this layer. |
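To make the Forward/Backward pair concrete, here is a minimal sketch of the layer's behavior on plain float arrays. The class name `LeakyReluSketch` and the array-based API are illustrative assumptions; the real SiaNet layer works on `Tensor` objects and stores its result internally rather than returning it.

```csharp
using System;

class LeakyReluSketch
{
    // Conceptual sketch only, not SiaNet's implementation.
    float alpha;
    float[] input;  // cached by Forward so Backward can use the sign of each element

    public LeakyReluSketch(float alpha = 0.3f) { this.alpha = alpha; }

    // Forward pass: f(x) = alpha * x for x < 0, f(x) = x for x >= 0.
    public float[] Forward(float[] x)
    {
        input = x;
        var y = new float[x.Length];
        for (int i = 0; i < x.Length; i++)
            y[i] = x[i] >= 0f ? x[i] : alpha * x[i];
        return y;
    }

    // Backward pass (chain rule): dL/dx = outputGrad * f'(x),
    // where f'(x) = 1 for x >= 0 and alpha for x < 0.
    public float[] Backward(float[] outputGrad)
    {
        var grad = new float[outputGrad.Length];
        for (int i = 0; i < outputGrad.Length; i++)
            grad[i] = outputGrad[i] * (input[i] >= 0f ? 1f : alpha);
        return grad;
    }
}
```

Note that Backward needs the inputs seen during Forward: the derivative depends on the sign of each input element, which is why the sketch caches `input`.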