Bounds on Lp errors in density ratio estimation via f-divergence loss functions

2 October 2024
Yoshiaki Kitazawa
Abstract

Density ratio estimation (DRE) is a core technique in machine learning used to capture relationships between two probability distributions. $f$-divergence loss functions, which are derived from variational representations of $f$-divergence, have become a standard choice in DRE for achieving cutting-edge performance. This study provides novel theoretical insights into DRE by deriving upper and lower bounds on the $L_p$ errors through $f$-divergence loss functions. These bounds apply to any estimator belonging to a class of Lipschitz continuous estimators, irrespective of the specific $f$-divergence loss function employed. The derived bounds are expressed as a product involving the data dimensionality and the expected value of the density ratio raised to the $p$-th power. Notably, the lower bound includes an exponential term that depends on the Kullback–Leibler (KL) divergence, revealing that the $L_p$ error increases significantly as the KL divergence grows when $p > 1$. This increase becomes even more pronounced as the value of $p$ grows. The theoretical insights are validated through numerical experiments.

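For intuition on the setting, the following is a minimal, illustrative sketch (not the paper's code) of density ratio estimation trained with one member of the f-divergence loss family: the variational (Nguyen-Wainwright-Jordan) representation of the KL divergence. The network RatioNet, the helper kl_variational_loss, and the toy Gaussian data below are hypothetical choices made for illustration only; the paper's bounds concern Lipschitz continuous estimators and hold regardless of which f-divergence loss is used.

# Illustrative sketch only (hypothetical names, not from the paper).
# Variational KL loss for DRE:  loss(T) = -( E_p[T(x)] - E_q[exp(T(x) - 1)] ).
# Its minimizer satisfies T*(x) = 1 + log p(x)/q(x), so the density ratio
# is recovered as r(x) = exp(T(x) - 1).
import torch
import torch.nn as nn

class RatioNet(nn.Module):
    """Small MLP playing the role of the variational function T."""
    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x).squeeze(-1)

def kl_variational_loss(T, x_p, x_q):
    # Negative NWJ lower bound on KL(p || q); minimizing it trains T.
    return -(T(x_p).mean() - torch.exp(T(x_q) - 1.0).mean())

if __name__ == "__main__":
    torch.manual_seed(0)
    dim = 2
    # Toy data: p = N(0.5, I), q = N(0, I); the true log-ratio is linear in x.
    x_p = torch.randn(2000, dim) + 0.5
    x_q = torch.randn(2000, dim)

    model = RatioNet(dim)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for step in range(500):
        opt.zero_grad()
        loss = kl_variational_loss(model, x_p, x_q)
        loss.backward()
        opt.step()

    with torch.no_grad():
        # Estimated density ratio p/q evaluated at q-samples.
        r_hat = torch.exp(model(x_q) - 1.0)
    print(f"mean estimated ratio on q-samples: {r_hat.mean().item():.3f} (should be near 1)")

Minimizing this loss drives T toward 1 + log p(x)/q(x), so exp(T(x) - 1) recovers the density ratio; on samples from q its empirical mean should be close to 1, which gives a quick sanity check on the estimator.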
View on arXiv: https://arxiv.org/abs/2410.01516
@article{kitazawa2025_2410.01516,
  title={Bounds on Lp errors in density ratio estimation via f-divergence loss functions},
  author={Yoshiaki Kitazawa},
  journal={arXiv preprint arXiv:2410.01516},
  year={2025}
}