
arXiv:1405.7910
Optimal CUR Matrix Decompositions

30 May 2014
Christos Boutsidis
David P. Woodruff
Abstract

The CUR decomposition of an $m \times n$ matrix $A$ finds an $m \times c$ matrix $C$ with a subset of $c < n$ columns of $A$, together with an $r \times n$ matrix $R$ with a subset of $r < m$ rows of $A$, as well as a $c \times r$ low-rank matrix $U$ such that the matrix $CUR$ approximates the matrix $A$; that is, $\|A - CUR\|_F^2 \le (1+\epsilon)\|A - A_k\|_F^2$, where $\|\cdot\|_F$ denotes the Frobenius norm and $A_k$ is the best rank-$k$ approximation of $A$ constructed via the SVD. We present input-sparsity-time and deterministic algorithms for constructing such a CUR decomposition with $c = O(k/\epsilon)$, $r = O(k/\epsilon)$, and $\mathrm{rank}(U) = k$. Up to constant factors, our algorithms are simultaneously optimal in $c$, $r$, and $\mathrm{rank}(U)$.
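To make the objects in the abstract concrete, here is a minimal sketch of a CUR factorization in NumPy. It uses plain uniform column/row sampling and the least-squares-optimal middle matrix $U = C^{+} A R^{+}$; this is an illustration of the $C$, $U$, $R$ structure only, not the paper's optimal input-sparsity-time or deterministic algorithms, which rely on careful (e.g. leverage-score-based) selection to achieve $c = r = O(k/\epsilon)$.

```python
import numpy as np

def cur_sketch(A, c, r, rng):
    """Toy CUR: uniformly sample c actual columns and r actual rows of A,
    then pick the U that minimizes ||A - C U R||_F for those choices.
    (Illustrative only; not the paper's optimal algorithm.)"""
    m, n = A.shape
    cols = rng.choice(n, size=c, replace=False)
    rows = rng.choice(m, size=r, replace=False)
    C = A[:, cols]   # m x c: a subset of A's columns
    R = A[rows, :]   # r x n: a subset of A's rows
    # Optimal middle factor given C and R (pseudoinverse solution):
    U = np.linalg.pinv(C) @ A @ np.linalg.pinv(R)   # c x r
    return C, U, R

rng = np.random.default_rng(0)
m, n, k = 60, 40, 5
# An exactly rank-k matrix, so CUR can reconstruct it almost surely
# once c, r >= k columns/rows are sampled.
A = rng.standard_normal((m, k)) @ rng.standard_normal((k, n))
C, U, R = cur_sketch(A, c=2 * k, r=2 * k, rng=rng)
rel_err = np.linalg.norm(A - C @ U @ R) / np.linalg.norm(A)
print(f"relative Frobenius error: {rel_err:.2e}")
```

Because the test matrix has exact rank $k$, any $2k$ generic columns span its column space, so the reconstruction error is near machine precision; for general matrices the error is governed by the $(1+\epsilon)\|A - A_k\|_F^2$ bound only under the paper's sampling schemes.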
