
Finite sample rates for logistic regression with small noise or few samples

Abstract

The logistic regression estimator is known to inflate the magnitude of its coefficients if the sample size $n$ is small, the dimension $p$ is (moderately) large, or the signal-to-noise ratio $1/\sigma$ is large (probabilities of observing a label are close to 0 or 1). With this in mind, we study the logistic regression estimator with $p \ll n/\log n$, assuming Gaussian covariates and labels generated by the Gaussian link function, with a mild optimization constraint on the estimator's length to ensure existence. We provide finite sample guarantees for its direction, which serves as a classifier, and its Euclidean norm, which is an estimator of the signal-to-noise ratio. We distinguish between two regimes. In the low-noise/small-sample regime ($\sigma \lesssim (p\log n)/n$), we show that the estimator's direction (and consequently the classification error) achieves the rate $(p\log n)/n$, which is, up to the log term, the rate of the noiseless problem. In this case, the norm of the estimator is at least of order $n/(p\log n)$. If instead $(p\log n)/n \lesssim \sigma \lesssim 1$, the estimator's direction achieves the rate $\sqrt{\sigma p\log n/n}$, whereas its norm converges to the true norm at the rate $\sqrt{p\log n/(n\sigma^3)}$. As a corollary, the data are not linearly separable with high probability in this regime. In either regime, logistic regression provides a competitive classifier.
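The setting of the abstract can be reproduced in a minimal simulation: Gaussian covariates, labels drawn through the Gaussian (probit) link with small $\sigma$, and a logistic MLE computed under a norm constraint. The sketch below is not taken from the paper; the norm cap `R`, the step size, and the iteration count are illustrative choices standing in for the paper's length constraint. In the low-noise regime it exhibits the two behaviors the abstract describes: the fitted direction aligns closely with the true direction, while the fitted norm inflates well beyond the unit norm of the true coefficient vector.

```python
import math
import numpy as np

rng = np.random.default_rng(0)
n, p, sigma = 200, 5, 0.1  # few samples, moderate dimension, low noise (high SNR 1/sigma)

# True unit-norm direction; the signal-to-noise ratio is 1/sigma
beta = np.ones(p) / np.sqrt(p)
X = rng.standard_normal((n, p))

# Labels in {-1, +1} from the Gaussian (probit) link: P(y = 1 | x) = Phi(<x, beta> / sigma)
phi = lambda t: 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))
probs = np.array([phi(v / sigma) for v in X @ beta])
y = np.where(rng.random(n) < probs, 1.0, -1.0)

# Logistic MLE by projected gradient descent; the cap R plays the role of the
# paper's length constraint, guaranteeing a minimizer exists even when the data
# happen to be linearly separable (R = 20 and step = 0.5 are illustrative, not
# values from the paper)
R, step, b = 20.0, 0.5, np.zeros(p)
for _ in range(3000):
    margins = np.clip(y * (X @ b), -30.0, 30.0)  # clip for numerical stability
    weights = 1.0 / (1.0 + np.exp(margins))      # = sigmoid(-margin)
    grad = -(X * (y * weights)[:, None]).mean(axis=0)
    b -= step * grad
    norm_b = np.linalg.norm(b)
    if norm_b > R:
        b *= R / norm_b  # project back onto the ball of radius R

# Direction aligns with beta; the norm inflates toward the 1/sigma scale
alignment = float(b @ beta / np.linalg.norm(b))
print(f"norm = {np.linalg.norm(b):.2f}, alignment with true direction = {alignment:.3f}")
```

Running this with larger $\sigma$ (say $\sigma = 1$) moves the simulation into the second regime, where the norm of the estimator stays close to the true norm instead of inflating.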
