Robust Sub-Gaussian Principal Component Analysis and Width-Independent Schatten Packing

12 June 2020
Arun Jambulapati
Jerry Li
Kevin Tian
arXiv:2006.06980
Abstract

We develop two methods for the following fundamental statistical task: given an $\epsilon$-corrupted set of $n$ samples from a $d$-dimensional sub-Gaussian distribution, return an approximate top eigenvector of the covariance matrix. Our first robust PCA algorithm runs in polynomial time, returns a $1 - O(\epsilon \log \epsilon^{-1})$-approximate top eigenvector, and is based on a simple iterative filtering approach. Our second, which attains a slightly worse approximation factor, runs in nearly-linear time and sample complexity under a mild spectral gap assumption. These are the first polynomial-time algorithms yielding non-trivial information about the covariance of a corrupted sub-Gaussian distribution without requiring additional algebraic structure of moments. As a key technical tool, we develop the first width-independent solvers for Schatten-$p$ norm packing semidefinite programs, giving a $(1 + \epsilon)$-approximate solution in $O(p \log(\tfrac{nd}{\epsilon}) \epsilon^{-1})$ input-sparsity time iterations (where $n$, $d$ are problem dimensions).
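The first algorithm above is described only as "a simple iterative filtering approach." The sketch below illustrates the generic filtering idea, not the paper's actual procedure: repeatedly estimate a top eigenvector from the retained samples, score each sample by its energy along that direction, and trim the highest-scoring tail before re-estimating. The helper name `filtered_top_eigenvector`, the per-round trimming rule, the fixed round count, and the toy corruption model are all simplifying assumptions made here for illustration.

```python
# Minimal illustrative sketch of iterative filtering for robust top-eigenvector
# estimation. NOT the paper's algorithm: the trimming rule, round count, and
# lack of a principled stopping test are placeholder simplifications.
import numpy as np

def filtered_top_eigenvector(X, eps, rounds=5):
    """X: (n, d) samples (inliers assumed zero-mean); eps: corruption fraction."""
    X = np.asarray(X, dtype=float)
    mask = np.ones(len(X), dtype=bool)            # samples currently retained
    for _ in range(rounds):
        S = X[mask]
        cov = S.T @ S / len(S)                    # empirical second moment
        v = np.linalg.eigh(cov)[1][:, -1]         # current top-eigenvector estimate
        scores = (S @ v) ** 2                     # energy along v; outliers inflate this
        cutoff = np.quantile(scores, 1.0 - eps)   # crude tail threshold
        kept = np.flatnonzero(mask)
        mask[kept[scores > cutoff]] = False       # trim the extreme tail, re-estimate
    S = X[mask]
    return np.linalg.eigh(S.T @ S / len(S))[1][:, -1]

# Toy demo: inliers have variance 9 along coordinate 0; a 5% cluster of planted
# outliers along coordinate 1 flips the naive (unfiltered) top eigenvector.
rng = np.random.default_rng(0)
n, d, eps = 2000, 50, 0.05
X = rng.normal(size=(n, d))
X[:, 0] *= 3.0                                    # true top direction: e_0
X[: int(eps * n)] = 0.0
X[: int(eps * n), 1] = 15.0                       # corrupted samples piled along e_1
v_hat = filtered_top_eigenvector(X, eps)
print(abs(v_hat[0]))                              # close to 1: aligned with e_0
```

After the planted outliers are trimmed in the first round, the remaining rounds only shave tails of the inlier distribution, so the returned direction stays close to the true top eigenvector; the paper's analyzed algorithm uses a far more careful filtering and stopping criterion than this sketch.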
