Quantum Algorithms and Lower Bounds for Linear Regression with Norm Constraints

Abstract

Lasso and Ridge are important minimization problems in machine learning and statistics. They are versions of linear regression with squared loss where the vector $\theta \in \mathbb{R}^d$ of coefficients is constrained in either $\ell_1$-norm (for Lasso) or in $\ell_2$-norm (for Ridge). We study the complexity of quantum algorithms for finding $\varepsilon$-minimizers for these minimization problems. We show that for Lasso we can get a quadratic quantum speedup in terms of $d$ by speeding up the cost-per-iteration of the Frank-Wolfe algorithm, while for Ridge the best quantum algorithms are linear in $d$, as are the best classical algorithms. As a byproduct of our quantum lower bound for Lasso, we also prove the first classical lower bound for Lasso that is tight up to polylog-factors.
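
For intuition on the approach the abstract mentions, below is a minimal classical sketch of the Frank-Wolfe (conditional gradient) iteration for the norm-constrained Lasso formulation, i.e. minimizing the squared loss over the $\ell_1$-ball. The per-iteration bottleneck is the linear minimization oracle, which for the $\ell_1$-ball reduces to an argmax over the $d$ gradient coordinates; this is, roughly, the step the paper's quantum algorithm accelerates. The function name, step-size schedule, and parameters here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def frank_wolfe_lasso(X, y, radius=1.0, num_iters=100):
    """Classical Frank-Wolfe sketch for the constrained Lasso:
    minimize (1/n) * ||X @ theta - y||^2  subject to  ||theta||_1 <= radius.
    Each iteration moves toward the best vertex of the l1-ball,
    i.e. a signed, scaled standard basis vector."""
    n, d = X.shape
    theta = np.zeros(d)
    for t in range(num_iters):
        # Gradient of the squared loss at the current iterate.
        grad = (2.0 / n) * X.T @ (X @ theta - y)
        # Linear minimization oracle over the l1-ball: pick the coordinate
        # with the largest absolute gradient entry. This argmax over d
        # entries is the per-iteration cost a quantum algorithm could
        # speed up quadratically (an assumption for illustration here).
        i = np.argmax(np.abs(grad))
        s = np.zeros(d)
        s[i] = -radius * np.sign(grad[i])
        # Standard Frank-Wolfe step-size schedule.
        gamma = 2.0 / (t + 2)
        theta = (1 - gamma) * theta + gamma * s
    return theta

# Example usage on synthetic sparse data.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
y = X[:, :3] @ np.array([0.5, -0.3, 0.2]) + 0.01 * rng.standard_normal(200)
theta_hat = frank_wolfe_lasso(X, y, radius=1.0, num_iters=500)
```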
