ResearchTrend.AI

arXiv:2102.05485 (v5, latest)

On the Properties of Kullback-Leibler Divergence Between Gaussians

10 February 2021
Yufeng Zhang
Wanwei Liu
Zhenbang Chen
Ji Wang
Kenli Li
Abstract

Kullback-Leibler (KL) divergence is one of the most important divergence measures between probability distributions. In this paper, we investigate the properties of KL divergence between Gaussians. Firstly, for any two $n$-dimensional Gaussians $\mathcal{N}_1$ and $\mathcal{N}_2$, we find the supremum of $KL(\mathcal{N}_1||\mathcal{N}_2)$ when $KL(\mathcal{N}_2||\mathcal{N}_1)\leq \epsilon$ for $\epsilon>0$. This reveals the approximate symmetry of small KL divergence between Gaussians. We also find the infimum of $KL(\mathcal{N}_1||\mathcal{N}_2)$ when $KL(\mathcal{N}_2||\mathcal{N}_1)\geq M$ for $M>0$. Secondly, for any three $n$-dimensional Gaussians $\mathcal{N}_1$, $\mathcal{N}_2$, and $\mathcal{N}_3$, we find a tight bound on $KL(\mathcal{N}_1||\mathcal{N}_3)$ if $KL(\mathcal{N}_1||\mathcal{N}_2)$ and $KL(\mathcal{N}_2||\mathcal{N}_3)$ are bounded. This reveals that the KL divergence between Gaussians satisfies a relaxed triangle inequality. Importantly, all the bounds in the theorems presented in this paper are independent of the dimension $n$.
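As a quick illustration of the quantities the abstract discusses, the sketch below computes the standard closed-form KL divergence between two multivariate Gaussians and evaluates it in both directions for a pair of nearby distributions. The specific means and covariances are arbitrary assumptions chosen for demonstration; the code is not taken from the paper itself, which derives bounds relating the two directions rather than an implementation.

```python
import numpy as np

def kl_gaussians(m1, S1, m2, S2):
    """Closed-form KL(N(m1, S1) || N(m2, S2)) for n-dimensional Gaussians:
    0.5 * [ tr(S2^-1 S1) + (m2-m1)^T S2^-1 (m2-m1) - n + ln(det S2 / det S1) ]
    """
    n = m1.shape[0]
    S2_inv = np.linalg.inv(S2)
    diff = m2 - m1
    # slogdet is numerically safer than log(det(...)) for large n
    _, logdet1 = np.linalg.slogdet(S1)
    _, logdet2 = np.linalg.slogdet(S2)
    return 0.5 * (np.trace(S2_inv @ S1) + diff @ S2_inv @ diff - n
                  + logdet2 - logdet1)

# Two nearby 2-D Gaussians (hypothetical example parameters): when one
# direction of the KL divergence is small, so is the other -- the
# "approximate symmetry of small KL divergence" the paper quantifies.
m1, S1 = np.zeros(2), np.eye(2)
m2, S2 = np.array([0.1, 0.0]), 1.1 * np.eye(2)
print(kl_gaussians(m1, S1, m2, S2))  # small
print(kl_gaussians(m2, S2, m1, S1))  # also small, though not equal
```

Note that KL divergence is asymmetric in general; the paper's contribution is to bound how far apart the two directions can be when one of them is small.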
