Bounding Neyman-Pearson Region with f-Divergences

13 May 2025
Andrew Mullhaupt
Cheng Peng
Abstract

The Neyman-Pearson region of a simple binary hypothesis testing problem is the set of points whose coordinates represent the false positive rate and false negative rate of some test. The lower boundary of this region is given by the Neyman-Pearson lemma and is, up to a coordinate change, equivalent to the optimal ROC curve. We establish a novel lower bound for the boundary in terms of any f-divergence. Since the bound generated by hockey-stick f-divergences characterizes the Neyman-Pearson boundary, this bound is best possible. In the case of KL divergence, this bound improves Pinsker's inequality. Furthermore, we obtain a closed-form refined upper bound for the Neyman-Pearson boundary in terms of the Chernoff α-coefficient. Finally, we present methods for constructing pairs of distributions that can approximately or exactly realize any given Neyman-Pearson boundary.
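
The hockey-stick characterization mentioned in the abstract is easy to see numerically. What follows is a minimal Python sketch, not taken from the paper: for the illustrative pair P = N(0,1) under H0 and Q = N(μ,1) under H1, the Neyman-Pearson boundary β(α) is attained by threshold tests, and every hockey-stick divergence E_γ(Q‖P) = sup_A [Q(A) − γ·P(A)] gives the linear lower bound β(α) ≥ 1 − γ·α − E_γ(Q‖P). The separation μ = 1.5, the γ grid, and the integration limits are arbitrary choices for the demonstration.

from scipy.integrate import quad
from scipy.stats import norm

mu = 1.5  # separation between the hypotheses (illustrative choice)

def np_boundary(alpha):
    # Exact minimal FNR at FPR alpha for P = N(0,1) vs Q = N(mu,1).
    # The likelihood ratio is monotone in x, so the optimal test rejects
    # H0 when x > t with t = Phi^{-1}(1 - alpha); the FNR is Phi(t - mu).
    t = norm.ppf(1.0 - alpha)
    return norm.cdf(t - mu)

def hockey_stick(gamma):
    # E_gamma(Q||P) = integral of max(q(x) - gamma * p(x), 0) dx.
    integrand = lambda x: max(norm.pdf(x, loc=mu) - gamma * norm.pdf(x), 0.0)
    value, _ = quad(integrand, -10.0, 10.0 + mu)
    return value

# Lower-bound the boundary by the best of a few hockey-stick bounds.
gammas = [0.5, 1.0, 2.0, 4.0, 8.0]
E = {g: hockey_stick(g) for g in gammas}

print(f"{'alpha':>6} {'beta(alpha)':>12} {'HS lower bound':>15}")
for alpha in [0.01, 0.05, 0.1, 0.2, 0.5]:
    beta = np_boundary(alpha)
    bound = max(1.0 - g * alpha - E[g] for g in gammas)
    assert beta >= bound - 1e-8  # the bound never exceeds the true boundary
    print(f"{alpha:6.2f} {beta:12.4f} {bound:15.4f}")

Refining the γ grid tightens the piecewise-linear envelope toward the exact boundary; this convex-conjugate relationship is the sense in which hockey-stick divergences characterize the Neyman-Pearson region.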

View on arXiv
@article{mullhaupt2025_2505.08899,
  title={Bounding Neyman-Pearson Region with $f$-Divergences},
  author={Andrew Mullhaupt and Cheng Peng},
  journal={arXiv preprint arXiv:2505.08899},
  year={2025}
}