Class Relu
Rectified Linear Unit.
With default values, it returns element-wise max(x, 0).
Otherwise it follows: f(x) = max_value for x >= max_value, f(x) = x for threshold <= x < max_value, f(x) = alpha * (x - threshold) otherwise.
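The piecewise definition above can be sketched as a scalar function. Note that this is an illustration of the general formula only: the `alpha`, `max_value`, and `threshold` names come from the description, while this class's constructor (below) takes no parameters, so the defaults always apply in SiaNet.

```python
def relu(x, alpha=0.0, max_value=None, threshold=0.0):
    """Generalized ReLU following the piecewise definition above.

    With the default values it reduces to max(x, 0).
    Illustrative sketch only; not the SiaNet implementation.
    """
    if max_value is not None and x >= max_value:
        return max_value          # clipped region: f(x) = max_value
    if x >= threshold:
        return x                  # linear region: f(x) = x
    return alpha * (x - threshold)  # leaky region below the threshold
```

With the defaults, `relu(2.0)` returns `2.0` and `relu(-3.0)` returns `0.0`.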
Namespace: SiaNet.Layers.Activations
Assembly: SiaNet.dll
Syntax
public class Relu : BaseLayer
Constructors
Relu()
Initializes a new instance of the Relu class.
Declaration
public Relu()
Methods
Backward(Tensor)
Calculates the gradient of this layer's activation function with respect to its input.
Declaration
public override void Backward(Tensor outputgrad)
Parameters
Type | Name | Description
---|---|---
Tensor | outputgrad | The gradient of the loss with respect to this layer's output, propagated back from the following layer.
Overrides
BaseLayer.Backward(Tensor)
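For the default case f(x) = max(x, 0), the derivative is 1 where the forward input was positive and 0 elsewhere, so the backward pass masks the incoming gradient. A minimal sketch (the function name and the explicit `x` argument are hypothetical; the actual layer caches its forward input internally):

```python
def relu_backward(x, output_grad):
    """Backward pass of default ReLU (max(x, 0)): pass the incoming
    gradient through where the forward input was positive, zero it
    elsewhere. Illustrative sketch only."""
    return [g if xi > 0 else 0.0 for xi, g in zip(x, output_grad)]
```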
Forward(Tensor)
Forwards the inputs and computes the output.
Declaration
public override void Forward(Tensor x)
Parameters
Type | Name | Description
---|---|---
Tensor | x | The input tensor for this layer.
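The forward pass applies the activation element-wise over the input tensor. A minimal sketch over a flat list of values (the function name is hypothetical, not part of the SiaNet API):

```python
def relu_forward(x):
    """Element-wise max(v, 0) over the input values: the default
    Relu forward pass. Illustrative sketch only."""
    return [max(v, 0.0) for v in x]
```

For example, an input of `[-1.0, 2.0]` produces `[0.0, 2.0]`.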