Tags: BayesianMachineLearning · InformationTheory · MachineLearning · Statistics
The relative entropy or KL divergence between two distributions $p$ and $q$ is
$$\mathrm{KL}(p \,\|\, q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx .$$
It is
- Non-negative: $\mathrm{KL}(p \,\|\, q) \geq 0$
- Definite: $\mathrm{KL}(p \,\|\, q) = 0$ if and only if $p = q$ (almost everywhere)
but it does not define a metric, as it is (see the numerical sketch after this list)
- Asymmetric: $\mathrm{KL}(p \,\|\, q) \neq \mathrm{KL}(q \,\|\, p)$ in general for $p \neq q$
- Does not obey the triangle inequality
- Not a Bregman Divergence
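A minimal numerical sketch of the first three properties, assuming two discrete distributions on the same finite support; the vectors `p` and `q` below are illustrative choices, not anything from the text.

```python
import numpy as np

def kl(p, q):
    """KL(p || q) = sum_i p_i * log(p_i / q_i) for discrete distributions."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    # Terms with p_i = 0 contribute 0, by the convention 0 * log 0 = 0.
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.5, 0.3])

print(kl(p, q))            # non-negative: strictly positive since p != q
print(kl(p, p))            # definite: exactly 0 when the arguments coincide
print(kl(p, q), kl(q, p))  # asymmetric: the two directions differ
```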
The unnormalized relative entropy is a Bregman Divergence, generated by $\phi(p) = \int \left( p(x) \log p(x) - p(x) \right) dx$, and is given by
$$\widetilde{\mathrm{KL}}(p \,\|\, q) = \int \left( p(x) \log \frac{p(x)}{q(x)} - p(x) + q(x) \right) dx ,$$
which reduces to $\mathrm{KL}(p \,\|\, q)$ when $p$ and $q$ are both normalized.
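A short sketch checking this identity numerically for discrete, unnormalized nonnegative vectors: the Bregman divergence of $\phi(x) = \sum_i (x_i \log x_i - x_i)$ agrees with the unnormalized relative entropy. The helper names and the test vectors are illustrative assumptions.

```python
import numpy as np

def generalized_kl(p, q):
    """Unnormalized relative entropy: sum_i (p_i log(p_i/q_i) - p_i + q_i)."""
    return float(np.sum(p * np.log(p / q) - p + q))

def bregman(phi, grad_phi, p, q):
    """Bregman divergence D_phi(p, q) = phi(p) - phi(q) - <grad phi(q), p - q>."""
    return float(phi(p) - phi(q) - np.dot(grad_phi(q), p - q))

phi = lambda x: np.sum(x * np.log(x) - x)
grad_phi = lambda x: np.log(x)      # elementwise gradient of phi

p = np.array([0.7, 1.2, 0.4])       # nonnegative, not required to sum to 1
q = np.array([0.5, 0.9, 1.1])

print(generalized_kl(p, q))         # the two values agree
print(bregman(phi, grad_phi, p, q))
```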