Kullback-Leibler (KL) divergence is one of the most important divergence measures between probability distributions. In this paper, we investigate the properties of KL divergence between Gaussians. Firstly, for any two $n$-dimensional Gaussians $\mathcal{N}_1$ and $\mathcal{N}_2$, we find the supremum of $KL(\mathcal{N}_1\|\mathcal{N}_2)$ when $KL(\mathcal{N}_2\|\mathcal{N}_1)\le \varepsilon$ for $\varepsilon>0$. This reveals the approximate symmetry of small KL divergence between Gaussians. We also find the infimum of $KL(\mathcal{N}_1\|\mathcal{N}_2)$ when $KL(\mathcal{N}_2\|\mathcal{N}_1)\ge M$ for $M>0$. Secondly, for any three $n$-dimensional Gaussians $\mathcal{N}_1$, $\mathcal{N}_2$ and $\mathcal{N}_3$, we find a tight bound of $KL(\mathcal{N}_1\|\mathcal{N}_3)$ if $KL(\mathcal{N}_1\|\mathcal{N}_2)$ and $KL(\mathcal{N}_2\|\mathcal{N}_3)$ are bounded. This reveals that the KL divergence between Gaussians follows a relaxed triangle inequality. Importantly, all the bounds in the theorems presented in this paper are independent of the dimension $n$.
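For background (not stated in the abstract itself), the quantity studied here has a standard closed form: writing $\mathcal{N}_1 = \mathcal{N}(\mu_1, \Sigma_1)$ and $\mathcal{N}_2 = \mathcal{N}(\mu_2, \Sigma_2)$ for the means and covariance matrices, the KL divergence between two $n$-dimensional Gaussians is
$$KL(\mathcal{N}_1\|\mathcal{N}_2) = \frac{1}{2}\left(\operatorname{tr}\!\left(\Sigma_2^{-1}\Sigma_1\right) + (\mu_2-\mu_1)^{\top}\Sigma_2^{-1}(\mu_2-\mu_1) - n + \ln\frac{\det\Sigma_2}{\det\Sigma_1}\right),$$
which makes explicit the asymmetry in the two arguments that the paper's supremum and infimum results quantify.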