
Class LogCosh

Logarithm of the hyperbolic cosine of the prediction error.

log(cosh(x)) is approximately equal to (x**2) / 2 for small x and to abs(x) - log(2) for large x. This means that 'logcosh' works mostly like the mean squared error, but will not be so strongly affected by the occasional wildly incorrect prediction.
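The small sketch below illustrates that approximation numerically. It is a standalone C# program using only System.Math, not the SiaNet API, and the sample x values are chosen for illustration only.

```csharp
using System;

class LogCoshApproximation
{
    static void Main()
    {
        // For small errors log(cosh(x)) behaves like x^2 / 2 (quadratic, MSE-like);
        // for large errors it behaves like |x| - log(2) (roughly linear), so a single
        // wild prediction contributes far less than it would under mean squared error.
        foreach (double x in new[] { 0.1, 0.5, 5.0, 50.0 })
        {
            double exact = Math.Log(Math.Cosh(x));
            double smallApprox = x * x / 2.0;
            double largeApprox = Math.Abs(x) - Math.Log(2.0);
            Console.WriteLine($"x={x,5}: log(cosh(x))={exact:F4}  x^2/2={smallApprox:F4}  |x|-log(2)={largeApprox:F4}");
        }
    }
}
```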

Inheritance
System.Object
BaseLoss
LogCosh
Inherited Members
BaseLoss.Name
Namespace: SiaNet.Losses
Assembly: SiaNet.dll
Syntax
public class LogCosh : BaseLoss

Constructors


LogCosh()

Initializes a new instance of the LogCosh class.

Declaration
public LogCosh()
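A minimal usage sketch: the parameterless constructor is all that is needed to create the loss instance. How the instance is then wired into a model is omitted here, since that depends on the rest of the SiaNet API.

```csharp
// Create the log-cosh loss; no configuration arguments are required.
var logCosh = new SiaNet.Losses.LogCosh();
```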

Methods


Backward(Tensor, Tensor)

Backpropagation method that calculates the gradient of the loss function.

Declaration
public override Tensor Backward(Tensor preds, Tensor labels)
Parameters
Type     Name     Description
Tensor   preds    The predicted result.
Tensor   labels   The true result.

Returns
Type     Description
Tensor   The gradient tensor.
Overrides
BaseLoss.Backward(Tensor, Tensor)
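Mathematically, the gradient of log(cosh(preds - labels)) with respect to preds is tanh(preds - labels). The sketch below illustrates that element-wise math on plain arrays; it is not the SiaNet Tensor implementation.

```csharp
using System;

static class LogCoshGradientSketch
{
    // Element-wise gradient of log(cosh(p - y)) with respect to p:
    // d/dp log(cosh(p - y)) = tanh(p - y).
    public static double[] Backward(double[] preds, double[] labels)
    {
        var grad = new double[preds.Length];
        for (int i = 0; i < preds.Length; i++)
            grad[i] = Math.Tanh(preds[i] - labels[i]);
        return grad;
    }
}
```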

Forward(Tensor, Tensor)

Forwards the inputs and calculates the loss.

Declaration
public override Tensor Forward(Tensor preds, Tensor labels)
Parameters
Type     Name     Description
Tensor   preds    The predicted result.
Tensor   labels   The true result.

Returns
Type     Description
Tensor   The calculated loss tensor.
Overrides
BaseLoss.Forward(Tensor, Tensor)
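For reference, the forward computation is the element-wise value log(cosh(preds - labels)). The plain-array sketch below shows that math only; whether SiaNet additionally averages over elements is not stated on this page, so the raw per-element values are returned.

```csharp
using System;

static class LogCoshForwardSketch
{
    // Element-wise log-cosh of the prediction error. This is the numerically naive
    // form; Math.Cosh overflows for very large errors, so production code typically
    // uses an equivalent stable formulation.
    public static double[] Forward(double[] preds, double[] labels)
    {
        var loss = new double[preds.Length];
        for (int i = 0; i < preds.Length; i++)
            loss[i] = Math.Log(Math.Cosh(preds[i] - labels[i]));
        return loss;
    }
}
```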

See Also

BaseLoss