Better Agnostic Clustering Via Relaxed Tensor Norms

Abstract

We develop a new family of convex relaxations for $k$-means clustering based on sum-of-squares norms, a relaxation of the injective tensor norm that is efficiently computable using the Sum-of-Squares algorithm. We give an algorithm based on this relaxation that recovers a faithful approximation to the true means in the given data whenever the low-degree moments of the points in each cluster have bounded sum-of-squares norms. We then prove a sharp upper bound on the sum-of-squares norms for moment tensors of any distribution that satisfies the \emph{Poincaré inequality}. The Poincaré inequality is a central inequality in probability theory, and a large class of distributions satisfy it, including Gaussians, product distributions, strongly log-concave distributions, and any sum or uniformly continuous transformation of such distributions. As an immediate corollary, for any $\gamma > 0$, we obtain an efficient algorithm for learning the means of a mixture of $k$ arbitrary Poincaré distributions in $\mathbb{R}^d$ in time $d^{O(1/\gamma)}$ so long as the means have separation $\Omega(k^{\gamma})$. This in particular yields an algorithm for learning Gaussian mixtures with separation $\Omega(k^{\gamma})$, thus partially resolving an open problem of \citet{regev2017learning}. Our algorithm works even in the outlier-robust setting where an $\epsilon$ fraction of arbitrary outliers are added to the data, as long as the fraction of outliers is smaller than the smallest cluster. We therefore obtain results in the strong agnostic setting where, in addition to not knowing the distribution family, the data itself may be arbitrarily corrupted.
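
For reference, the two notions the abstract relies on can be stated in their standard textbook forms; the normalization and degree conventions below are assumptions and may differ slightly from those used in the paper. For a symmetric order-$m$ tensor $T$ and a distribution $D$ on $\mathbb{R}^d$,
\[
\|T\|_{\mathrm{inj}} \;=\; \max_{\|u\|_2 = 1} \langle T, u^{\otimes m} \rangle,
\qquad\qquad
\operatorname{Var}_{x \sim D}\bigl[f(x)\bigr] \;\le\; \sigma^2 \, \mathbb{E}_{x \sim D}\bigl[\|\nabla f(x)\|_2^2\bigr],
\]
where the first display is the injective tensor norm and the second is the Poincaré inequality with constant $\sigma$, required to hold for every differentiable $f : \mathbb{R}^d \to \mathbb{R}$. The sum-of-squares norm is, roughly, the relaxation of $\|\cdot\|_{\mathrm{inj}}$ obtained by replacing the maximization over unit vectors with a maximization over low-degree pseudo-distributions on the unit sphere, which is what makes it computable by semidefinite programming.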
