ResearchTrend.AI
arXiv:2005.02970
Outlier-Robust Clustering of Non-Spherical Mixtures

6 May 2020
Ainesh Bakshi
Pravesh Kothari
Abstract

We give the first outlier-robust efficient algorithm for clustering a mixture of $k$ statistically separated $d$-dimensional Gaussians ($k$-GMMs). Concretely, our algorithm takes as input an $\epsilon$-corrupted sample from a $k$-GMM and, with high probability, in $d^{\text{poly}(k/\eta)}$ time outputs an approximate clustering that misclassifies at most a $k^{O(k)}(\epsilon+\eta)$ fraction of the points whenever every pair of mixture components is separated by $1-\exp(-\text{poly}(k/\eta)^k)$ in total variation (TV) distance. Such a result was not previously known even for $k=2$. TV separation is the statistically weakest possible notion of separation and captures important special cases such as mixed linear regression and subspace clustering.

Our main conceptual contribution is to distill simple analytic properties that are necessary and sufficient for mixture models to be (efficiently) clusterable: (certifiable) hypercontractivity and bounded variance of degree-2 polynomials, and anti-concentration of linear projections. As a consequence, our results extend to clustering mixtures of arbitrary affine transforms of the uniform distribution on the $d$-dimensional unit sphere. Even the information-theoretic clusterability of separated distributions satisfying these two analytic assumptions was not known prior to our work and is likely to be of independent interest.

Our algorithms build on the recent sequence of works relying on certifiable anti-concentration, first introduced in the works of Karmalkar, Klivans, and Kothari and of Raghavendra and Yau in 2019. Our techniques expand the sum-of-squares toolkit to show robust certifiability of TV-separated Gaussian clusters in data. This involves giving a low-degree sum-of-squares proof of statements that relate parameter (i.e., mean and covariance) distance to total variation distance, relying only on hypercontractivity and anti-concentration.
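To make the problem setup concrete, the following is a minimal sketch (not the paper's algorithm) of what an ε-corrupted sample from a 2-GMM looks like, and of the misclassification-fraction criterion against which a clustering is judged. All names, dimensions, and the naive baseline clusterer here are illustrative assumptions, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw n samples from a 2-component Gaussian mixture in d dimensions;
# the two means are far apart, so the components are nearly TV-separated.
n, d, eps = 1000, 5, 0.05
means = np.stack([np.zeros(d), 4.0 * np.ones(d)])
labels = rng.integers(0, 2, size=n)          # true cluster labels
samples = means[labels] + rng.standard_normal((n, d))

# Adversarial eps-corruption: an adversary overwrites an eps-fraction
# of the points with arbitrary values.
m = int(eps * n)
corrupt_idx = rng.choice(n, size=m, replace=False)
samples[corrupt_idx] = rng.uniform(-100, 100, size=(m, d))

def misclassified_fraction(pred, truth):
    """Fraction of points misclassified, minimized over the two
    possible relabelings of the clusters."""
    agree = np.mean(pred == truth)
    return min(agree, 1.0 - agree)

# Naive baseline for illustration only: threshold the first coordinate.
# The paper's algorithm instead uses sum-of-squares relaxations and
# works under far weaker (TV) separation.
pred = (samples[:, 0] > 2.0).astype(int)
clean = np.setdiff1d(np.arange(n), corrupt_idx)
err = misclassified_fraction(pred[clean], labels[clean])
```

The guarantee in the abstract bounds exactly this kind of error: the output clustering misclassifies at most a $k^{O(k)}(\epsilon+\eta)$ fraction of points, no matter which ε-fraction the adversary corrupted.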
