We study Empirical Risk Minimizers (ERM) and Regularized Empirical Risk Minimizers (RERM) for regression problems with convex and $L$-Lipschitz loss functions. We consider a setting where $|\mathcal{O}|$ malicious outliers contaminate the labels. In that case, under a local Bernstein condition, we show that the $L_2$-error rate is bounded by $r_N + A L |\mathcal{O}|/N$, where $N$ is the total number of observations, $r_N$ is the $L_2$-error rate in the non-contaminated setting, and $A$ is a parameter coming from the local Bernstein condition. When $r_N$ is minimax-rate-optimal in a non-contaminated setting, the rate $r_N + A L |\mathcal{O}|/N$ is also minimax-rate-optimal when $|\mathcal{O}|$ outliers contaminate the labels. The main results of the paper can be used for many non-regularized and regularized procedures under weak assumptions on the noise. We present results for Huber's M-estimators (without penalization or regularized by the $\ell_1$-norm) and for general regularized learning problems in reproducing kernel Hilbert spaces when the noise can be heavy-tailed.
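To make the setting concrete, here is a minimal sketch of an $\ell_1$-regularized Huber M-estimator (one instance of the RERM procedures covered by the abstract). The function names, the penalty level `lam`, the Huber threshold `delta`, the synthetic data, and the use of a generic derivative-free solver are illustrative assumptions, not the paper's construction; the paper analyzes the exact minimizer of the regularized empirical risk.

```python
import numpy as np
from scipy.optimize import minimize

def huber_loss(r, delta=1.0):
    """Huber loss: quadratic for |r| <= delta, linear (hence Lipschitz) beyond."""
    abs_r = np.abs(r)
    return np.where(abs_r <= delta, 0.5 * r**2, delta * (abs_r - 0.5 * delta))

def rerm_huber_l1(X, y, lam=0.1, delta=1.0):
    """Approximate the l1-regularized ERM with Huber loss (illustrative sketch):
    minimize (1/N) sum_i huber(y_i - <x_i, beta>) + lam * ||beta||_1."""
    n, d = X.shape

    def objective(beta):
        residuals = y - X @ beta
        return huber_loss(residuals, delta).mean() + lam * np.abs(beta).sum()

    # Powell is derivative-free, so it tolerates the non-smooth l1 term;
    # this only approximates the exact minimizer studied in the paper.
    result = minimize(objective, np.zeros(d), method="Powell")
    return result.x

# Example: sparse linear model where a few labels are maliciously corrupted.
rng = np.random.default_rng(0)
N, d = 200, 10
X = rng.standard_normal((N, d))
beta_star = np.zeros(d)
beta_star[:3] = [2.0, -1.5, 1.0]
y = X @ beta_star + rng.standard_normal(N)
y[:5] += 50.0  # outliers contaminate the labels only, not the design X
beta_hat = rerm_huber_l1(X, y, lam=0.05)
print(np.round(beta_hat, 2))
```

Because the Huber loss is Lipschitz, each corrupted label can shift the empirical risk by at most an amount proportional to $L/N$, which is the intuition behind the $A L |\mathcal{O}|/N$ term in the bound.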