Risk Bounds for Mixture Density Estimation on Compact Domains via the $h$-Lifted Kullback--Leibler Divergence

19 April 2024
Mark Chiu Chong
Hien Nguyen
TrungTin Nguyen
arXiv: 2404.12586 (abs / PDF / HTML)
Abstract

We consider the problem of estimating probability density functions based on sample data, using a finite mixture of densities from some component class. To this end, we introduce the $h$-lifted Kullback--Leibler (KL) divergence as a generalization of the standard KL divergence and a criterion for conducting risk minimization. Under a compact support assumption, we prove an $\mathcal{O}(1/\sqrt{n})$ bound on the expected estimation error when using the $h$-lifted KL divergence, which extends the results of Rakhlin et al. (2005, ESAIM: Probability and Statistics, Vol. 9) and Li and Barron (1999, Advances in Neural Information Processing Systems, Vol. 12) to permit the risk bounding of density functions that are not strictly positive. We develop a procedure for the computation of the corresponding maximum $h$-lifted likelihood estimators ($h$-MLLEs) using the Majorization-Maximization framework and provide experimental results in support of our theoretical bounds.
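
As a reading aid (not part of the abstract above), the LaTeX sketch below restates the standard KL divergence that the $h$-lifted divergence generalizes, the generic form of a finite-mixture estimator, and the claimed $\mathcal{O}(1/\sqrt{n})$ rate. The symbols $f_0$, $\mathcal{F}$, $\phi_k$, $\pi_k$, $K$, and $f_K$ are notational placeholders chosen here rather than notation taken from the paper, and the $h$-lifted divergence itself is not defined in the abstract, so it is only referenced, not written out.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Standard KL divergence between a target density f_0 and a candidate f.
% (The paper's h-lifted variant generalizes this; its exact form is not
%  stated in the abstract and is therefore not reproduced here.)
\[
  \mathrm{KL}(f_0 \,\|\, f) \;=\; \int f_0(x)\,\log\frac{f_0(x)}{f(x)}\,\mathrm{d}x .
\]
% Generic finite-mixture estimator built from a component class F
% (placeholder notation, not the paper's).
\[
  f_K(x) \;=\; \sum_{k=1}^{K} \pi_k\,\phi_k(x),
  \qquad \phi_k \in \mathcal{F}, \quad \pi_k \ge 0, \quad \sum_{k=1}^{K}\pi_k = 1 .
\]
% The abstract's guarantee, stated schematically: the expected estimation
% error of the h-MLLE, measured via the h-lifted KL divergence, decays as
\[
  \mathbb{E}\bigl[\text{estimation error}\bigr] \;=\; \mathcal{O}\!\left(1/\sqrt{n}\right),
  \qquad n = \text{sample size}.
\]
\end{document}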
