Density ratio estimation (DRE) is a core technique in machine learning used to capture relationships between two probability distributions. f-divergence loss functions, which are derived from variational representations of f-divergence, have become a standard choice in DRE for achieving cutting-edge performance. This study provides novel theoretical insights into DRE by deriving upper and lower bounds on the Lp errors of DRE through f-divergence loss functions. These bounds apply to any estimator belonging to a class of Lipschitz continuous estimators, irrespective of the specific f-divergence loss function employed. The derived bounds are expressed as a product involving the data dimensionality and the expected value of the density ratio raised to the p-th power. Notably, the lower bound includes an exponential term that depends on the Kullback-Leibler (KL) divergence, revealing that the Lp error increases significantly as the KL divergence grows when p > 1. This increase becomes even more pronounced as the value of p grows. The theoretical insights are validated through numerical experiments.
@article{kitazawa2025_2410.01516,
  title   = {Bounds on Lp errors in density ratio estimation via f-divergence loss functions},
  author  = {Yoshiaki Kitazawa},
  journal = {arXiv preprint arXiv:2410.01516},
  year    = {2025}
}
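To make the setting of the abstract concrete, below is a minimal sketch (not the paper's implementation) of DRE with an f-divergence loss built from the variational representation D_f(P||Q) = sup_T E_P[T(x)] - E_Q[f*(T(x))]. The KL instance f(u) = u log u with conjugate f*(t) = exp(t - 1), the toy Gaussian data, the small MLP estimator T, and all hyperparameters are illustrative assumptions; the trained T yields the ratio estimate r(x) = exp(T(x) - 1).

# Illustrative sketch only: DRE by minimizing an f-divergence variational loss,
# here for the KL divergence f(u) = u log u, whose conjugate is f*(t) = exp(t - 1).
# Data, network, and optimizer settings are arbitrary choices for demonstration.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy data (assumed): P = N(0, I), Q = N(0.5, I) in d dimensions.
d = 5
n = 2000
x_p = torch.randn(n, d)            # samples from the numerator distribution P
x_q = torch.randn(n, d) + 0.5      # samples from the denominator distribution Q

# A small, smooth MLP as the estimator T (a Lipschitz-friendly choice).
T = nn.Sequential(nn.Linear(d, 64), nn.Tanh(),
                  nn.Linear(64, 64), nn.Tanh(),
                  nn.Linear(64, 1))
opt = torch.optim.Adam(T.parameters(), lr=1e-3)

def kl_variational_loss(t_p, t_q):
    # Negative of the variational lower bound E_P[T] - E_Q[f*(T)] for the KL case.
    return -(t_p.mean() - torch.exp(t_q - 1.0).mean())

for step in range(2000):
    opt.zero_grad()
    loss = kl_variational_loss(T(x_p), T(x_q))
    loss.backward()
    opt.step()

# The optimal T satisfies T(x) = f'(r(x)) = log r(x) + 1, so r(x) = exp(T(x) - 1).
with torch.no_grad():
    r_hat = torch.exp(T(x_q) - 1.0)
    # Sanity check: E_Q[r] = 1 in theory, so this should be roughly 1 after training.
    print("estimated E_Q[r]:", r_hat.mean().item())

Swapping f (e.g., to the Pearson chi-squared divergence) changes only the conjugate f* in the loss and the link T -> r, which is why bounds that hold irrespective of the specific f-divergence loss, as in the abstract, apply uniformly to this family of estimators.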