
Consistency Conditions for Differentiable Surrogate Losses

19 May 2025
Drona Khurana
Anish Thilagar
Dhamma Kimpara
Rafael Frongillo
Abstract

The statistical consistency of surrogate losses for discrete prediction tasks is often checked via the condition of calibration. However, directly verifying calibration can be arduous. Recent work shows that for polyhedral surrogates, a less arduous condition, indirect elicitation (IE), is equivalent to calibration. We give the first results of this type for non-polyhedral surrogates, specifically the class of convex differentiable losses. We first prove that, under mild conditions, IE and calibration are equivalent for one-dimensional losses in this class. We then construct a counterexample showing that this equivalence fails in higher dimensions. This motivates the introduction of strong IE, a strengthened form of IE that is equally easy to verify. We establish that strong IE implies calibration for differentiable surrogates and is both necessary and sufficient for strongly convex, differentiable surrogates. Finally, we apply these results to a range of problems to demonstrate the power of IE and strong IE for designing and analyzing consistent differentiable surrogates.
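As a concrete illustration (not taken from the paper), the logistic loss is a classic one-dimensional convex differentiable surrogate for binary 0-1 classification. Its conditional minimizer is the logit of the positive-class probability, and composing that minimizer with the sign link recovers the Bayes classifier — the kind of "surrogate minimizer plus link elicits the target" relationship that indirect elicitation formalizes. A minimal sketch:

```python
import math

def logistic_loss(u, y):
    # Convex, differentiable surrogate: log(1 + exp(-y*u)), y in {-1, +1}.
    return math.log1p(math.exp(-y * u))

def expected_loss(u, p):
    # Conditional expected surrogate loss when P(Y = +1) = p.
    return p * logistic_loss(u, +1) + (1 - p) * logistic_loss(u, -1)

def surrogate_minimizer(p):
    # Closed-form minimizer of the expected logistic loss: the logit u* = log(p / (1-p)).
    return math.log(p / (1 - p))

def link(u):
    # Link function mapping surrogate reports to discrete predictions.
    return +1 if u >= 0 else -1

for p in [0.2, 0.5, 0.8]:
    u_star = surrogate_minimizer(p)
    # The linked surrogate minimizer matches the Bayes-optimal 0-1 prediction.
    bayes = +1 if p >= 0.5 else -1
    assert link(u_star) == bayes
    # Sanity check that u_star is indeed a local minimizer.
    assert expected_loss(u_star, p) <= expected_loss(u_star + 0.1, p)
    assert expected_loss(u_star, p) <= expected_loss(u_star - 0.1, p)
```

This only demonstrates the one-dimensional, binary case; the paper's contribution concerns when such an IE-style relationship suffices for calibration, including higher-dimensional settings where plain IE fails.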

@article{khurana2025_2505.13760,
  title={Consistency Conditions for Differentiable Surrogate Losses},
  author={Drona Khurana and Anish Thilagar and Dhamma Kimpara and Rafael Frongillo},
  journal={arXiv preprint arXiv:2505.13760},
  year={2025}
}