
Sample and Computationally Efficient Robust Learning of Gaussian Single-Index Models

Abstract

A single-index model (SIM) is a function of the form $\sigma(\mathbf{w}^{\ast} \cdot \mathbf{x})$, where $\sigma: \mathbb{R} \to \mathbb{R}$ is a known link function and $\mathbf{w}^{\ast}$ is a hidden unit vector. We study the task of learning SIMs in the agnostic (a.k.a. adversarial label noise) model with respect to the $L_2^2$ loss under the Gaussian distribution. Our main result is a sample and computationally efficient agnostic proper learner that attains $L_2^2$-error of $O(\mathrm{OPT})+\epsilon$, where $\mathrm{OPT}$ is the optimal loss. The sample complexity of our algorithm is $\tilde{O}(d^{\lceil k^{\ast}/2\rceil}+d/\epsilon)$, where $k^{\ast}$ is the information exponent of $\sigma$, i.e., the degree of its first non-zero Hermite coefficient. This sample bound nearly matches known CSQ lower bounds, even in the realizable setting. Prior algorithmic work in this setting had focused on learning in the realizable case or in the presence of semi-random noise; prior computationally efficient robust learners required significantly stronger assumptions on the link function.
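The information exponent that governs the sample bound can be computed directly from the Hermite expansion of the link function. The following sketch (an illustration, not part of the paper's algorithm) estimates it numerically using Gauss-Hermite quadrature under the standard Gaussian measure; the function name and tolerance are assumptions for the example:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

def information_exponent(link, max_degree=10, tol=1e-8):
    """Degree of the first non-zero (probabilists') Hermite coefficient
    of `link` under the standard Gaussian measure N(0, 1)."""
    # 64-point Gauss-Hermite quadrature for the weight e^{-x^2/2}
    nodes, weights = hermegauss(64)
    weights = weights / np.sqrt(2.0 * np.pi)  # normalize to an N(0,1) expectation
    vals = link(nodes)
    for k in range(1, max_degree + 1):
        basis = np.zeros(k + 1)
        basis[k] = 1.0  # coefficient vector selecting He_k
        coef = np.sum(weights * vals * hermeval(nodes, basis))  # E[link(x) He_k(x)]
        if abs(coef) > tol:
            return k
    return None
```

For instance, an odd link such as `np.tanh` has information exponent 1, while the even link $x \mapsto x^2$ has information exponent 2 (its first-degree Hermite coefficient $\mathbb{E}[x^3]$ vanishes), so the bound above scales as $\tilde{O}(d)$ in the first case and $\tilde{O}(d)$ with $\lceil 2/2 \rceil = 1$ in the second.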
