
Optimizing Shortfall Risk Metric for Learning Regression Models

Abstract

We consider the problem of estimating and optimizing utility-based shortfall risk (UBSR) of a loss, say $(Y - \hat Y)^2$, in the context of a regression problem. Empirical risk minimization with a UBSR objective is challenging since UBSR is a non-linear function of the underlying distribution. We first derive a concentration bound for UBSR estimation using independent and identically distributed (i.i.d.) samples. We then frame the UBSR optimization problem as minimization of a pseudo-linear function in the space of achievable distributions $\mathcal D$ of the loss $(Y - \hat Y)^2$. We construct a gradient oracle for the UBSR objective and a linear minimization oracle (LMO) for the set $\mathcal D$. Using these oracles, we devise a bisection-type algorithm, and establish convergence to the UBSR-optimal solution.
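The abstract does not spell out the estimator. As a rough illustration only, assuming the common definition of UBSR for a loss variable $X$ with a non-decreasing utility $\ell$ and risk level $\lambda$, namely $\mathrm{SR}_{\ell,\lambda}(X) = \inf\{t \in \mathbb{R} : \mathbb{E}[\ell(X - t)] \le \lambda\}$, a sample-based UBSR estimate can be computed by bisection over the threshold $t$; the function and parameter names below are illustrative placeholders, not taken from the paper.

import numpy as np

def ubsr_estimate(loss_samples, util, lam, lo=0.0, hi=100.0, tol=1e-6):
    # Sample-based UBSR estimate by bisection, assuming
    #   UBSR(X) = inf{ t : E[util(X - t)] <= lam }
    # with util non-decreasing, so t -> mean(util(x - t)) - lam is non-increasing.
    # The caller must choose [lo, hi] so that it brackets the root.
    x = np.asarray(loss_samples, dtype=float)
    def excess(t):
        return np.mean(util(x - t)) - lam
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if excess(mid) > 0:
            lo = mid  # empirical constraint violated: increase the threshold t
        else:
            hi = mid  # constraint satisfied: try a smaller threshold t
    return hi

# Illustration on a squared regression loss (Y - Yhat)^2 with an
# exponential-type utility; all numerical choices here are placeholders.
rng = np.random.default_rng(0)
y = rng.normal(size=10_000)
y_hat = 0.1 * y
sq_loss = (y - y_hat) ** 2
print(ubsr_estimate(sq_loss, util=lambda z: np.exp(z) - 1.0, lam=1.0))

Because the empirical constraint is monotone in the threshold, bisection converges geometrically. Note this sketch only covers estimation from i.i.d. samples; the paper's optimization step operates on the distribution space $\mathcal D$ through the gradient and linear minimization oracles described in the abstract.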

@article{ramaswamy2025_2505.17777,
  title={Optimizing Shortfall Risk Metric for Learning Regression Models},
  author={Harish G. Ramaswamy and L.A. Prashanth},
  journal={arXiv preprint arXiv:2505.17777},
  year={2025}
}