Finite sample rates for logistic regression with small noise or few samples

Abstract

The logistic regression estimator is known to inflate the magnitude of its coefficients if the sample size $n$ is small, the dimension $p$ is (moderately) large, or the signal-to-noise ratio $1/\sigma$ is large (probabilities of observing a label are close to 0 or 1). With this in mind, we study the logistic regression estimator with $p \ll n/\log n$, assuming Gaussian covariates and labels generated by the Gaussian link function, with a mild optimization constraint on the estimator's length to ensure existence. We provide finite sample guarantees for its direction, which serves as a classifier, and its Euclidean norm, which is an estimator for the signal-to-noise ratio. We distinguish between two regimes. In the low-noise/small-sample regime ($n\sigma \lesssim p\log n$), we show that the estimator's direction (and consequently the classification error) achieves the rate $(p\log n)/n$, as if the problem were noiseless. In this case, the norm of the estimator is at least of order $n/(p\log n)$. If instead $n\sigma \gtrsim p\log n$, the estimator's direction achieves the rate $\sqrt{\sigma p\log n/n}$, whereas its norm converges to the true norm at the rate $\sqrt{p\log n/(n\sigma^3)}$. As a corollary, the data are not linearly separable with high probability in this regime. The logistic regression estimator allows one to determine, with high probability, which regime occurs. Therefore, inference for logistic regression is possible in the regime $n\sigma \gtrsim p\log n$. In either case, logistic regression provides a competitive classifier.
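
For quick reference, the two regimes and the rates stated in the abstract can be summarized schematically as follows (this restatement uses only the abstract's notation; "direction error" and "norm error" are informal labels for the guarantees on the estimator's direction and Euclidean norm):
$$
n\sigma \lesssim p\log n:\qquad \text{direction error} \lesssim \frac{p\log n}{n},\qquad \text{estimator norm} \gtrsim \frac{n}{p\log n};
$$
$$
n\sigma \gtrsim p\log n:\qquad \text{direction error} \lesssim \sqrt{\frac{\sigma p\log n}{n}},\qquad \text{norm error} \lesssim \sqrt{\frac{p\log n}{n\sigma^3}}.
$$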
