ResearchTrend.AI
arXiv:2403.19480
HHH-Consistency Guarantees for Regression

28 March 2024
Anqi Mao
M. Mohri
Yutao Zhong
Abstract

We present a detailed study of H-consistency bounds for regression. We first present new theorems that generalize the tools previously given to establish H-consistency bounds. This generalization proves essential for analyzing H-consistency bounds specific to regression. Next, we prove a series of novel H-consistency bounds for surrogate loss functions of the squared loss, under the assumption of a symmetric distribution and a bounded hypothesis set. This includes positive results for the Huber loss, all ℓ_p losses, p ≥ 1, the squared ε-insensitive loss, as well as a negative result for the ε-insensitive loss used in squared Support Vector Regression (SVR). We further leverage our analysis of H-consistency for regression and derive principled surrogate losses for adversarial regression (Section 5). This readily establishes novel algorithms for adversarial regression, for which we report favorable experimental results in Section 6.
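For reference, a minimal sketch of the regression losses named in the abstract, each applied to the residual r = y − h(x). The definitions are the standard ones; the parameter defaults (δ, ε) are illustrative choices, not values from the paper:

```python
def huber(r, delta=1.0):
    """Huber loss: quadratic near zero, linear in the tails."""
    a = abs(r)
    return 0.5 * a ** 2 if a <= delta else delta * (a - 0.5 * delta)

def lp(r, p=1.0):
    """l_p loss |r|**p for p >= 1 (p = 2 recovers the squared loss)."""
    return abs(r) ** p

def eps_insensitive(r, eps=0.1):
    """epsilon-insensitive loss: zero inside the eps-tube around the target."""
    return max(0.0, abs(r) - eps)

def sq_eps_insensitive(r, eps=0.1):
    """Squared epsilon-insensitive loss."""
    return max(0.0, abs(r) - eps) ** 2
```

Each function is convex in the residual; the paper's contribution is characterizing which of these surrogates admit H-consistency bounds with respect to the squared loss.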
