CrossEntropyLoss

class getml.models.loss_functions.CrossEntropyLoss

Bases: getml.models.loss_functions._LossFunction

Cross entropy loss

The cross entropy between two probability distributions p(x) and q(x) combines the information contained in p(x) with the additional information required when q(x) is used in place of p(x). In technical terms: it is the entropy of p(x) plus the Kullback-Leibler divergence from q(x) to p(x), a measure of dissimilarity between probability distributions (not a true distance, since it is not symmetric).

H(p,q) = H(p) + D_{KL}(p||q)

For discrete probability distributions the cross entropy loss can be calculated by

H(p,q) = - \sum_{x \in X} p(x) \log q(x)

and for continuous probability distributions by

H(p,q) = - \int_{X} p(x) \log q(x) dx

where X is the support of the samples and p(x) and q(x) are two discrete or continuous probability distributions over X.
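
For illustration, the discrete formula and the decomposition H(p,q) = H(p) + D_{KL}(p||q) can be checked numerically with NumPy. This snippet is an explanatory sketch only and is not part of the getml API:

    import numpy as np

    # Two discrete probability distributions over the same support X.
    p = np.array([0.7, 0.2, 0.1])  # reference distribution p(x)
    q = np.array([0.5, 0.3, 0.2])  # approximating distribution q(x)

    # Cross entropy: H(p, q) = -sum_x p(x) * log q(x)
    cross_entropy = -np.sum(p * np.log(q))

    # Entropy of p: H(p) = -sum_x p(x) * log p(x)
    entropy_p = -np.sum(p * np.log(p))

    # Kullback-Leibler divergence: D_KL(p||q) = sum_x p(x) * log(p(x) / q(x))
    kl_divergence = np.sum(p * np.log(p / q))

    # The decomposition H(p, q) = H(p) + D_KL(p||q) holds numerically.
    assert np.isclose(cross_entropy, entropy_p + kl_divergence)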

Note

Recommended loss function for classification problems.
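
A minimal usage sketch: the class is instantiated and then passed to a getml model. How the instance is supplied to the model (for example, via a loss_function parameter) is not documented on this page and is assumed here; consult the model reference for the exact constructor signature.

    from getml.models.loss_functions import CrossEntropyLoss

    # Instantiate the cross entropy loss for a classification task.
    loss = CrossEntropyLoss()

    # The instance is then typically handed to a model constructor,
    # e.g. through a loss_function argument (assumed; see the getml
    # model documentation for the exact parameter name).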