ResearchTrend.AI

arXiv:1611.03225
Sharper Bounds for Regularized Data Fitting

10 November 2016
H. Avron
K. Clarkson
David P. Woodruff
Abstract

We study matrix sketching methods for regularized variants of linear regression, low-rank approximation, and canonical correlation analysis. Our main focus is on sketching techniques which preserve the objective function value for regularized problems, an area that has remained largely unexplored. We study regularization both in a fairly broad setting and in the specific context of the popular and widely used technique of ridge regularization; for the latter, as applied to each of these problems, we show algorithmic resource bounds in which the {\em statistical dimension} appears in places where in previous bounds the rank would appear. The statistical dimension is always smaller than the rank, and decreases as the amount of regularization increases. In particular, for the ridge low-rank approximation problem $\min_{Y,X} \lVert YX - A \rVert_F^2 + \lambda \lVert Y \rVert_F^2 + \lambda \lVert X \rVert_F^2$, where $Y \in \mathbb{R}^{n \times k}$ and $X \in \mathbb{R}^{k \times d}$, we give an approximation algorithm needing
\[ O(\mathtt{nnz}(A)) + \tilde{O}\bigl((n+d)\varepsilon^{-1}k \min\{k, \varepsilon^{-1}\mathtt{sd}_\lambda(Y^*)\}\bigr) + \mathtt{poly}\bigl(\mathtt{sd}_\lambda(Y^*)\varepsilon^{-1}\bigr) \]
time, where $\mathtt{sd}_\lambda(Y^*) \le k$ is the statistical dimension of $Y^*$, $Y^*$ is an optimal $Y$, $\varepsilon$ is an error parameter, and $\mathtt{nnz}(A)$ is the number of nonzero entries of $A$. This is faster than prior work, even when $\lambda = 0$. We also study regularization in a much more general setting. For example, we obtain sketching-based algorithms for the low-rank approximation problem $\min_{X,Y} \lVert YX - A \rVert_F^2 + f(Y,X)$ where $f(\cdot,\cdot)$ is a regularizing function satisfying some very general conditions (chiefly, invariance under orthogonal transformations).
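The statistical dimension driving these bounds has a standard closed form in terms of the singular values $\sigma_i$ of a matrix: $\mathtt{sd}_\lambda = \sum_i \sigma_i^2 / (\sigma_i^2 + \lambda)$. The short sketch below (not from the paper's code; the function name and test matrix are illustrative) computes it via an SVD and checks the two properties the abstract relies on: it never exceeds the rank, and it shrinks as $\lambda$ grows.

```python
# Illustrative sketch: ridge statistical dimension of a matrix,
# sd_lambda(A) = sum_i sigma_i^2 / (sigma_i^2 + lambda),
# computed from the singular values of A. At lambda = 0 it equals
# rank(A); it decreases monotonically as lambda increases.
import numpy as np

def statistical_dimension(A: np.ndarray, lam: float) -> float:
    """Statistical dimension of A for regularization parameter lam >= 0."""
    sigma = np.linalg.svd(A, compute_uv=False)
    return float(np.sum(sigma**2 / (sigma**2 + lam)))

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
rank = np.linalg.matrix_rank(A)

for lam in [0.0, 1.0, 100.0]:
    sd = statistical_dimension(A, lam)
    assert sd <= rank + 1e-9  # sd_lambda is always at most the rank
    print(f"lambda={lam:6.1f}  sd={sd:.2f}  rank={rank}")
```

In the paper's running-time bound, replacing the rank $k$ by $\mathtt{sd}_\lambda(Y^*) \le k$ in this way is exactly what makes the algorithm cheaper as regularization increases.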
