
Class KullbackLeiblerDivergence

KL Divergence, also known as relative entropy or information divergence/gain, is a measure of how one probability distribution diverges from a second, expected probability distribution.
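For reference, for a target distribution P (the labels) and a predicted distribution Q (the model output), the discrete KL divergence is commonly defined as

D_{\mathrm{KL}}(P \,\|\, Q) = \sum_{i} P(i)\,\log\frac{P(i)}{Q(i)}

How this class reduces the elementwise terms (per-sample sum, batch mean, clipping of Q away from zero) is an implementation detail not documented on this page.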

Inheritance
System.Object
BaseLoss
KullbackLeiblerDivergence
Inherited Members
BaseLoss.Name
Namespace: SiaNet.Losses
Assembly: SiaNet.dll
Syntax
public class KullbackLeiblerDivergence : BaseLoss

Constructors


KullbackLeiblerDivergence()

Initializes a new instance of the KullbackLeiblerDivergence class.

Declaration
public KullbackLeiblerDivergence()
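The constructor takes no arguments, so creating the loss is a single call. A minimal sketch (the variable name is illustrative only):

var klLoss = new KullbackLeiblerDivergence();   // ready to use wherever a BaseLoss is expected

The instance can then be used directly through Forward and Backward, as sketched after the method documentation below.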

Methods


Backward(Tensor, Tensor)

Backpropagation method to calculate the gradient of the loss function with respect to the predictions.

Declaration
public override Tensor Backward(Tensor preds, Tensor labels)
Parameters
Type     Name     Description
Tensor   preds    The predicted result.
Tensor   labels   The true result.

Returns
Type     Description
Tensor   The gradient of the loss with respect to preds.
Overrides
BaseLoss.Backward(Tensor, Tensor)
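For intuition, if the forward pass computes the elementwise term labels · log(labels / preds), the textbook per-element gradient with respect to the predictions is

\frac{\partial}{\partial \hat{y}_i}\left[ y_i \log\frac{y_i}{\hat{y}_i} \right] = -\,\frac{y_i}{\hat{y}_i}

This is only the standard form; the actual implementation may additionally clip preds away from zero or scale by the batch size.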

Forward(Tensor, Tensor)

Forwards the inputs and calculates the loss.

Declaration
public override Tensor Forward(Tensor preds, Tensor labels)
Parameters
Type     Name     Description
Tensor   preds    The predicted result.
Tensor   labels   The true result.

Returns
Type     Description
Tensor   The computed loss.
Overrides
BaseLoss.Forward(Tensor, Tensor)
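A minimal usage sketch of the two methods, assuming preds and labels are already available SiaNet Tensor instances (for example, model outputs and one-hot targets; how those tensors are built is not shown here):

var kl = new KullbackLeiblerDivergence();

// Forward pass: the KL divergence between labels and preds.
Tensor loss = kl.Forward(preds, labels);

// Backward pass: gradient of that loss with respect to preds,
// typically propagated back through the network during training.
Tensor grad = kl.Backward(preds, labels);

Both calls use only the signatures documented above; whether loss is already reduced to a scalar or still per-sample depends on the implementation.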

See Also

BaseLoss