
Class Activations

Activations can be used either through an Activation layer or through the activation argument supported by all forward layers.

Inheritance
System.Object
Keras
Base
Activations
Implements
System.IDisposable
Inherited Members
Base.Parameters
Base.None
Base.Init()
Base.ToPython()
Base.InvokeStaticMethod(Object, String, Dictionary<String, Object>)
Base.InvokeMethod(String, Dictionary<String, Object>)
Base.Item[String]
Keras.Instance
Keras.keras
Keras.keras2onnx
Keras.tfjs
Keras.Dispose()
Keras.ToTuple(Array)
Keras.ToList(Array)
System.Object.Equals(System.Object)
System.Object.Equals(System.Object, System.Object)
System.Object.GetHashCode()
System.Object.GetType()
System.Object.MemberwiseClone()
System.Object.ReferenceEquals(System.Object, System.Object)
System.Object.ToString()
Namespace: Keras
Assembly: Keras.dll
Syntax
public class Activations : Base, IDisposable

Methods


Elu(NDarray, Single)

Exponential Linear Unit: returns x if x > 0, and alpha * (exp(x) - 1) otherwise.

Declaration
public static NDarray Elu(NDarray x, float alpha = 1F)
Parameters
Type Name Description
Numpy.NDarray x

Input tensor.

System.Single alpha

Scalar, slope of the negative section. Defaults to 1.0.

Returns
Type Description
Numpy.NDarray

Output tensor
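A scalar Python sketch of the ELU formula the method applies element-wise (the function name and scalar signature are illustrative, not part of this API):

```python
import math

def elu(x, alpha=1.0):
    # ELU: identity for positive inputs; alpha * (exp(x) - 1) otherwise,
    # which saturates smoothly toward -alpha as x goes to -infinity
    return x if x > 0 else alpha * (math.exp(x) - 1.0)
```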

Exponential(NDarray)

Exponential (base e) activation function.

Declaration
public static NDarray Exponential(NDarray x)
Parameters
Type Name Description
Numpy.NDarray x

Input tensor.

Returns
Type Description
Numpy.NDarray

Output tensor
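Element-wise this is simply exp(x); a minimal scalar sketch in Python (function name illustrative):

```python
import math

def exponential(x):
    # Base-e exponential activation: e ** x
    return math.exp(x)
```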


HardSigmoid(NDarray)

Hard sigmoid activation function. Faster to compute than the sigmoid activation.

Declaration
public static NDarray HardSigmoid(NDarray x)
Parameters
Type Name Description
Numpy.NDarray x

Input tensor.

Returns
Type Description
Numpy.NDarray

Output tensor
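Keras defines hard sigmoid as the piecewise-linear clip of 0.2 * x + 0.5 to [0, 1]; a scalar Python sketch (function name illustrative):

```python
def hard_sigmoid(x):
    # Piecewise-linear sigmoid approximation: 0 below x = -2.5, 1 above x = 2.5,
    # and the line 0.2 * x + 0.5 in between -- no exp() call needed
    return max(0.0, min(1.0, 0.2 * x + 0.5))
```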


Linear(NDarray)

Linear (i.e. identity) activation function.

Declaration
public static NDarray Linear(NDarray x)
Parameters
Type Name Description
Numpy.NDarray x

Input tensor.

Returns
Type Description
Numpy.NDarray

Output tensor
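The identity map, sketched in Python only for completeness (function name illustrative):

```python
def linear(x):
    # Identity activation: the input passes through unchanged
    return x
```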


Relu(NDarray, Single, Nullable&lt;Single&gt;, Single)

Rectified Linear Unit. With default values, it returns element-wise max(x, 0). Otherwise it follows: f(x) = max_value for x &gt;= max_value, f(x) = x for threshold &lt;= x &lt; max_value, f(x) = alpha * (x - threshold) otherwise.

Declaration
public static NDarray Relu(NDarray x, float alpha = 0F, float? max_value = default(float?), float threshold = 0F)
Parameters
Type Name Description
Numpy.NDarray x

Input tensor.

System.Single alpha

Slope of the negative section. Defaults to zero.

System.Nullable&lt;System.Single&gt; max_value

Optional saturation threshold (maximum output value).

System.Single threshold

Threshold value below which values are damped by alpha. Defaults to zero.

Returns
Type Description
Numpy.NDarray

Output tensor
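A scalar Python sketch of the full ReLU semantics, with alpha as the leaky slope, max_value as a saturation cap, and threshold as the cut-off, applied element-wise by the method (function name illustrative):

```python
def relu(x, alpha=0.0, max_value=None, threshold=0.0):
    # Pass values at or above the threshold through; scale the rest by alpha
    y = x if x >= threshold else alpha * (x - threshold)
    # Optionally saturate at max_value
    if max_value is not None:
        y = min(y, max_value)
    return y
```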

Selu(NDarray)

Scaled Exponential Linear Unit (SELU). SELU is equal to scale * elu(x, alpha), where alpha and scale are predefined constants. The values of alpha and scale are chosen so that the mean and variance of the inputs are preserved between two consecutive layers, as long as the weights are initialized correctly (see lecun_normal initialization) and the number of inputs is "large enough" (see references for more information).

Declaration
public static NDarray Selu(NDarray x)
Parameters
Type Name Description
Numpy.NDarray x

Input tensor.

Returns
Type Description
Numpy.NDarray

The scaled exponential unit activation: scale * elu(x, alpha).
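A scalar Python sketch using the published SELU constants from the self-normalizing networks paper (names illustrative, constants truncated to double precision):

```python
import math

# Predefined constants chosen so mean and variance are preserved across layers
SELU_ALPHA = 1.6732632423543772
SELU_SCALE = 1.0507009873554805

def selu(x):
    # scale * elu(x, alpha)
    return SELU_SCALE * (x if x > 0 else SELU_ALPHA * (math.exp(x) - 1.0))
```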


Sigmoid(NDarray)

Sigmoid activation function.

Declaration
public static NDarray Sigmoid(NDarray x)
Parameters
Type Name Description
Numpy.NDarray x

Input tensor.

Returns
Type Description
Numpy.NDarray

Tensor, output of sigmoid
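Element-wise this is the logistic function 1 / (1 + exp(-x)); a scalar Python sketch (function name illustrative):

```python
import math

def sigmoid(x):
    # Logistic function: maps any real input into the open interval (0, 1)
    return 1.0 / (1.0 + math.exp(-x))
```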


Softmax(NDarray, Int32)

Softmax activation function

Declaration
public static NDarray Softmax(NDarray x, int axis = -1)
Parameters
Type Name Description
Numpy.NDarray x

The input tensor.

System.Int32 axis

Integer, axis along which the softmax normalization is applied.

Returns
Type Description
Numpy.NDarray

Tensor, output of softmax transformation.
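For a 1-D slice along the chosen axis, softmax exponentiates the inputs and normalizes them so they sum to 1; a plain-Python sketch over a list (names illustrative; the max is subtracted only for numerical stability and does not change the result):

```python
import math

def softmax(xs):
    # Shift by the maximum so exp() cannot overflow
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    # Normalize so the outputs form a probability distribution
    return [e / total for e in exps]
```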


Softplus(NDarray)

Softplus activation function. The softplus activation: log(exp(x) + 1).

Declaration
public static NDarray Softplus(NDarray x)
Parameters
Type Name Description
Numpy.NDarray x

Input tensor.

Returns
Type Description
Numpy.NDarray

Output tensor
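The formula log(exp(x) + 1) stated above, sketched for a scalar in Python (function name illustrative):

```python
import math

def softplus(x):
    # log(exp(x) + 1): smooth, everywhere-positive approximation of ReLU
    return math.log(math.exp(x) + 1.0)
```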


Softsign(NDarray)

The softsign activation: x / (abs(x) + 1).

Declaration
public static NDarray Softsign(NDarray x)
Parameters
Type Name Description
Numpy.NDarray x

Input tensor.

Returns
Type Description
Numpy.NDarray

Output tensor
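The formula x / (abs(x) + 1) stated above, sketched for a scalar in Python (function name illustrative):

```python
def softsign(x):
    # x / (|x| + 1): tanh-like squashing into (-1, 1), but with slower
    # (polynomial rather than exponential) saturation
    return x / (abs(x) + 1.0)
```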


Tanh(NDarray)

Hyperbolic tangent activation function.

Declaration
public static NDarray Tanh(NDarray x)
Parameters
Type Name Description
Numpy.NDarray x

Input tensor.

Returns
Type Description
Numpy.NDarray

Output tensor
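Element-wise this is tanh(x) = (exp(2x) - 1) / (exp(2x) + 1); a scalar Python sketch (the name tanh_act is illustrative and avoids shadowing math.tanh):

```python
import math

def tanh_act(x):
    # Hyperbolic tangent via (exp(2x) - 1) / (exp(2x) + 1); output in (-1, 1)
    e = math.exp(2.0 * x)
    return (e - 1.0) / (e + 1.0)
```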

Implements

System.IDisposable