
On the Properties of Kullback-Leibler Divergence Between Gaussians

Abstract

Kullback-Leibler (KL) divergence is one of the most important divergence measures between probability distributions. In this paper, we investigate the properties of KL divergence between Gaussians. Firstly, for any two $n$-dimensional Gaussians $\mathcal{N}_1$ and $\mathcal{N}_2$, we find the supremum of $KL(\mathcal{N}_1||\mathcal{N}_2)$ when $KL(\mathcal{N}_2||\mathcal{N}_1)\leq \epsilon$ for $\epsilon>0$. This reveals the approximate symmetry of small KL divergence between Gaussians. We also find the infimum of $KL(\mathcal{N}_1||\mathcal{N}_2)$ when $KL(\mathcal{N}_2||\mathcal{N}_1)\geq M$ for $M>0$. Secondly, for any three $n$-dimensional Gaussians $\mathcal{N}_1$, $\mathcal{N}_2$, and $\mathcal{N}_3$, we find a bound of $KL(\mathcal{N}_1||\mathcal{N}_3)$ if $KL(\mathcal{N}_1||\mathcal{N}_2)$ and $KL(\mathcal{N}_2||\mathcal{N}_3)$ are bounded. This reveals that the KL divergence between Gaussians follows a relaxed triangle inequality. Importantly, all the bounds in the theorems presented in this paper are independent of the dimension $n$.
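
For context, a minimal numerical sketch (not from the paper) of the standard closed-form KL divergence between multivariate Gaussians, used here only to illustrate the approximate symmetry the abstract describes: when $KL(\mathcal{N}_2||\mathcal{N}_1)$ is small, the reverse divergence $KL(\mathcal{N}_1||\mathcal{N}_2)$ is also small. All function names and parameter values below are hypothetical examples, not the paper's notation.

```python
# Illustrative sketch: closed-form KL(N1 || N2) for multivariate Gaussians,
# KL = 0.5 * [tr(S2^{-1} S1) + (m2-m1)^T S2^{-1} (m2-m1) - n + ln(det S2 / det S1)].
# Used to check numerically that two nearby Gaussians have nearly symmetric KL.
import numpy as np

def gaussian_kl(mu1, cov1, mu2, cov2):
    """KL(N1 || N2) for N1 = N(mu1, cov1) and N2 = N(mu2, cov2)."""
    n = mu1.shape[0]
    cov2_inv = np.linalg.inv(cov2)
    diff = mu2 - mu1
    return 0.5 * (
        np.trace(cov2_inv @ cov1)
        + diff @ cov2_inv @ diff
        - n
        + np.log(np.linalg.det(cov2) / np.linalg.det(cov1))
    )

# Two nearby 3-dimensional Gaussians (hypothetical example values):
# both directions of the KL divergence come out small and close to each other.
rng = np.random.default_rng(0)
mu1 = np.zeros(3)
mu2 = mu1 + 0.05 * rng.standard_normal(3)
A = rng.standard_normal((3, 3))
cov1 = A @ A.T + 3.0 * np.eye(3)
cov2 = cov1 + 0.05 * np.eye(3)

print(gaussian_kl(mu1, cov1, mu2, cov2))  # KL(N1 || N2)
print(gaussian_kl(mu2, cov2, mu1, cov1))  # KL(N2 || N1)
```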
