Optimizing Shortfall Risk Metric for Learning Regression Models

We consider the problem of estimating and optimizing the utility-based shortfall risk (UBSR) of a loss, say $Z$, in the context of a regression problem. Empirical risk minimization with a UBSR objective is challenging, since UBSR is a non-linear function of the underlying distribution. We first derive a concentration bound for UBSR estimation using independent and identically distributed (i.i.d.) samples. We then frame the UBSR optimization problem as the minimization of a pseudo-linear function over the set $\mathcal{D}$ of achievable distributions of the loss $Z$. We construct a gradient oracle for the UBSR objective and a linear minimization oracle (LMO) for the set $\mathcal{D}$. Using these oracles, we devise a bisection-type algorithm and establish its convergence to the UBSR-optimal solution.
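Concretely, UBSR is commonly defined as $\mathrm{SR}_\lambda(Z) = \inf\{t \in \mathbb{R} : \mathbb{E}[u(Z - t)] \le \lambda\}$ for an increasing utility function $u$ and threshold $\lambda$. As a rough illustration of sample-based UBSR estimation (a sketch under this standard definition, not the paper's exact estimator), the snippet below replaces the expectation with an empirical mean over i.i.d. samples and solves the resulting one-dimensional problem by bisection; the name `ubsr_estimate`, the exponential utility, and the bracketing heuristic are illustrative assumptions.

```python
import numpy as np

def ubsr_estimate(samples, util, lam, tol=1e-8):
    """Empirical UBSR via bisection (illustrative sketch).

    Estimates SR(Z) = inf{ t : E[util(Z - t)] <= lam } by replacing the
    expectation with a sample average. Since util is increasing, the map
    t -> mean(util(samples - t)) is non-increasing in t, so the root of
    the empirical constraint can be found by bisection.
    """
    z = np.asarray(samples, dtype=float)

    def g(t):
        # Empirical constraint function; non-increasing in t.
        return np.mean(util(z - t)) - lam

    # Initial bracket, expanded until g(lo) >= 0 >= g(hi).
    lo, hi = z.min() - 1.0, z.max() + 1.0
    while g(lo) < 0:
        lo -= (hi - lo)
    while g(hi) > 0:
        hi += (hi - lo)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            lo = mid  # constraint violated: the UBSR lies to the right
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical usage with exponential utility u(x) = exp(x) - 1.
rng = np.random.default_rng(0)
z = rng.normal(loc=1.0, scale=0.5, size=10_000)
est = ubsr_estimate(z, np.expm1, lam=0.1)
print(f"empirical UBSR estimate: {est:.4f}")
```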
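The role of pseudo-linearity in the optimization step can be pictured as follows: each sublevel set of a pseudo-linear objective is cut out by a linear constraint, so one LMO call tests whether a given objective level is achievable, and bisecting on that level pins down the optimum. The sketch below demonstrates this bisection-plus-LMO pattern on a stand-in linear-fractional objective over the probability simplex (linear-fractional functions with positive denominator are pseudo-linear); it is an analogy for the mechanism, not the paper's UBSR algorithm, and all names and parameters are illustrative.

```python
import numpy as np

def lmo_simplex(grad):
    """LMO over the probability simplex: the vertex minimizing <grad, x>."""
    x = np.zeros_like(grad)
    x[np.argmin(grad)] = 1.0
    return x

def bisect_pseudolinear(a, b, c, d, lo, hi, tol=1e-9):
    """Minimize f(x) = (a.x + b) / (c.x + d) over the simplex by bisecting
    on the objective value (requires c.x + d > 0 on the simplex).

    f is pseudo-linear, so the sublevel set {x : f(x) <= t} is the linear
    constraint (a - t*c).x + (b - t*d) <= 0; one LMO call checks whether
    it intersects the simplex.
    """
    while hi - lo > tol:
        t = 0.5 * (lo + hi)
        v = lmo_simplex(a - t * c)
        if (a - t * c) @ v + (b - t * d) <= 0:
            hi = t  # level t is achievable somewhere in the simplex
        else:
            lo = t  # every feasible point has objective above t
    return 0.5 * (lo + hi)

# Hypothetical usage: minimum value 0.75, attained at the first vertex.
a, c = np.array([1.0, 2.0, 3.0]), np.array([1.0, 1.0, 2.0])
print(bisect_pseudolinear(a, b=0.5, c=c, d=1.0, lo=0.0, hi=5.0))
```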
@article{ramaswamy2025_2505.17777,
  title   = {Optimizing Shortfall Risk Metric for Learning Regression Models},
  author  = {Harish G. Ramaswamy and L. A. Prashanth},
  journal = {arXiv preprint arXiv:2505.17777},
  year    = {2025}
}