Class KullbackLeiblerDivergence
KL Divergence, also known as relative entropy or information gain, is a measure of how one probability distribution diverges from a second, expected probability distribution.
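For discrete distributions, with P the true distribution (labels) and Q the predicted distribution (preds), the divergence is

$$
D_{\mathrm{KL}}(P \,\|\, Q) = \sum_{i} p_i \log \frac{p_i}{q_i}
$$

Loss implementations typically clip the predicted probabilities away from zero before taking the logarithm and reduce the per-element terms over the batch; check the class source for the exact clipping and reduction this class applies.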
Namespace: SiaNet.Losses
Assembly: SiaNet.dll
Syntax
public class KullbackLeiblerDivergence : BaseLoss
Constructors
KullbackLeiblerDivergence()
Initializes a new instance of the KullbackLeiblerDivergence class.
Declaration
public KullbackLeiblerDivergence()
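The constructor takes no arguments, so creating the loss is a one-liner:

```csharp
using SiaNet.Losses;

var kld = new KullbackLeiblerDivergence();
```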
Methods
Backward(Tensor, Tensor)
Backpropagation method to calculate the gradient of the loss function.
Declaration
public override Tensor Backward(Tensor preds, Tensor labels)
Parameters
| Type   | Name   | Description           |
|--------|--------|-----------------------|
| Tensor | preds  | The predicted result. |
| Tensor | labels | The true result.      |
Returns
| Type   | Description                                                |
|--------|------------------------------------------------------------|
| Tensor | The gradient of the loss with respect to the predictions. |
Overrides
BaseLoss.Backward(Tensor, Tensor)
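Differentiating the definition above with respect to each predicted probability gives the analytic gradient that a backward pass would propagate; the actual implementation may additionally clip the predictions or average over the batch, so treat this as a sketch of the math rather than the exact code path:

$$
\frac{\partial}{\partial q_i} \, D_{\mathrm{KL}}(P \,\|\, Q) = -\frac{p_i}{q_i}
$$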
Forward(Tensor, Tensor)
Forwards the inputs and calculates the loss.
Declaration
public override Tensor Forward(Tensor preds, Tensor labels)
Parameters
| Type   | Name   | Description           |
|--------|--------|-----------------------|
| Tensor | preds  | The predicted result. |
| Tensor | labels | The true result.      |
Returns
| Type   | Description          |
|--------|----------------------|
| Tensor | The calculated loss. |
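A minimal sketch of driving the loss directly; only the class and the two overrides documented above are taken from this page. The `Tensor` inputs are assumed to be probability distributions produced elsewhere, since how they are constructed depends on the backend configured for your SiaNet build, and the `using` directive for `Tensor` may need adjusting to match your SiaNet version:

```csharp
using SiaNet.Engine; // assumed location of the Tensor type; adjust for your SiaNet version
using SiaNet.Losses;

public static class KLDivergenceExample
{
    // preds: predicted distribution (e.g. a softmax output),
    // labels: true distribution (e.g. one-hot or soft targets).
    // Both are passed in rather than constructed here, since tensor
    // creation depends on the configured backend.
    public static void Evaluate(Tensor preds, Tensor labels)
    {
        var kld = new KullbackLeiblerDivergence();

        Tensor loss = kld.Forward(preds, labels);   // KL divergence between labels and preds
        Tensor grad = kld.Backward(preds, labels);  // gradient of the loss w.r.t. preds
    }
}
```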