arXiv:2002.07784
k-means++: few more steps yield constant approximation

18 February 2020
Davin Choo
Christoph Grunau
Julian Portmann
Václav Rozhon
Abstract

The k-means++ algorithm of Arthur and Vassilvitskii (SODA 2007) is a state-of-the-art algorithm for solving the k-means clustering problem and is known to give an O(log k)-approximation in expectation. Recently, Lattanzi and Sohler (ICML 2019) proposed augmenting k-means++ with O(k log log k) local search steps to yield a constant approximation (in expectation) to the k-means clustering problem. In this paper, we improve their analysis to show that, for any arbitrarily small constant ε > 0, with only εk additional local search steps, one can achieve a constant approximation guarantee (with high probability in k), resolving an open problem in their paper.
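To make the two ingredients of the abstract concrete, here is a minimal Python sketch of k-means++ D²-sampling seeding followed by one local-search step in the style of Lattanzi and Sohler (sample a candidate center by D²-sampling, then swap it in for whichever existing center most reduces the cost, keeping the swap only if it improves). All function names are illustrative, and this naive version recomputes all distances at each step, so it is O(nk) per step rather than the amortized implementation analyzed in the papers.

```python
import random

def dist2(p, q):
    # Squared Euclidean distance between two points.
    return sum((a - b) ** 2 for a, b in zip(p, q))

def cost(points, centers):
    # k-means cost: sum of squared distances to nearest center.
    return sum(min(dist2(p, c) for c in centers) for p in points)

def kmeanspp_seed(points, k, rng):
    # First center uniformly at random; each subsequent center by
    # D^2-sampling (probability proportional to squared distance
    # from the nearest already-chosen center).
    centers = [rng.choice(points)]
    for _ in range(k - 1):
        weights = [min(dist2(p, c) for c in centers) for p in points]
        centers.append(rng.choices(points, weights=weights, k=1)[0])
    return centers

def local_search_step(points, centers, rng):
    # One local-search step: D^2-sample a candidate center, then try
    # swapping it for each existing center; keep the best swap only
    # if it strictly lowers the cost.
    weights = [min(dist2(p, c) for c in centers) for p in points]
    cand = rng.choices(points, weights=weights, k=1)[0]
    best, best_cost = centers, cost(points, centers)
    for i in range(len(centers)):
        trial = centers[:i] + [cand] + centers[i + 1:]
        trial_cost = cost(points, trial)
        if trial_cost < best_cost:
            best, best_cost = trial, trial_cost
    return best
```

The paper's result is about how many such `local_search_step` calls are needed after seeding: εk steps already suffice for a constant-factor approximation with high probability, improving on the O(k log log k) steps of Lattanzi and Sohler. Since a step is only accepted when it strictly improves the cost, the cost is monotonically non-increasing across steps.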
