
Class LeakyRelu

Leaky version of a Rectified Linear Unit.

It allows a small gradient when the unit is not active: f(x) = alpha * x for x < 0, f(x) = x for x >= 0.
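For reference, the same piecewise function can be written as a standalone scalar sketch in C# (illustrative only, not SiaNet's Tensor implementation):

// Scalar sketch of the LeakyRelu definition above (illustrative, not SiaNet code).
static float LeakyReluScalar(float x, float alpha = 0.3f)
{
    // f(x) = x for x >= 0; f(x) = alpha * x for x < 0
    return x >= 0f ? x : alpha * x;
}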

Inheritance
System.Object
BaseLayer
LeakyRelu
Inherited Members
BaseLayer.Params
BaseLayer.Input
BaseLayer.Output
BaseLayer.Name
BaseLayer.SkipPred
BaseLayer.Item[String]
BaseLayer.BuildParam(String, Int64[], DataType, BaseInitializer, BaseConstraint, BaseRegularizer, Boolean)
Namespace: SiaNet.Layers.Activations
Assembly: SiaNet.dll
Syntax
public class LeakyRelu : BaseLayer

Constructors


LeakyRelu(Single)

Initializes a new instance of the LeakyRelu class.

Declaration
public LeakyRelu(float alpha = 0.3F)
Parameters
Type           Name   Description
System.Single  alpha  Negative slope coefficient.
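A minimal construction sketch (the variable names are illustrative, not from the SiaNet docs):

// Default negative slope (alpha = 0.3)
var leaky = new LeakyRelu();

// Custom negative slope
var leakySteep = new LeakyRelu(alpha: 0.01f);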

Properties


Alpha

Negative slope coefficient.

Declaration
public float Alpha { get; set; }
Property Value
Type           Description
System.Single  The alpha value (the negative slope coefficient).
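Because the property has a setter, the coefficient can also be changed after construction (illustrative):

var leaky = new LeakyRelu();
leaky.Alpha = 0.1f; // overrides the 0.3 default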

Methods


Backward(Tensor)

Calculates the gradient of this layer's function.

Declaration
public override void Backward(Tensor outputgrad)
Parameters
Type    Name        Description
Tensor  outputgrad  The output gradient calculated by the previous layer.

Overrides
BaseLayer.Backward(Tensor)
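The gradient follows from the piecewise definition: f'(x) = 1 for x >= 0 and f'(x) = alpha for x < 0, so each element of outputgrad is either passed through unchanged or scaled by alpha. A scalar sketch of this chain-rule step (illustrative, not the Tensor implementation):

// dL/dx = dL/dy * f'(x), where f'(x) = 1 for x >= 0 and alpha otherwise
static float LeakyReluGradScalar(float x, float outputGrad, float alpha = 0.3f)
{
    return outputGrad * (x >= 0f ? 1f : alpha);
}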

Forward(Tensor)

Forwards the inputs and computes the output.

Declaration
public override void Forward(Tensor x)
Parameters
Type    Name  Description
Tensor  x     The input tensor for this layer.

Overrides
BaseLayer.Forward(Tensor)
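Based on the inherited BaseLayer.Output member, a typical forward call looks like the following sketch (assuming x is an already-populated Tensor):

var leaky = new LeakyRelu(0.3f);
leaky.Forward(x);     // applies f elementwise to x
var y = leaky.Output; // activated tensor, same shape as x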

See Also

BaseLayer