arXiv:2302.00037
Differentially-Private Hierarchical Clustering with Provable Approximation Guarantees

31 January 2023
Jacob Imola
Alessandro Epasto
Mohammad Mahdian
Vincent Cohen-Addad
Vahab Mirrokni
Abstract

Hierarchical clustering is a popular unsupervised machine learning method with decades of history and numerous applications. We initiate the study of differentially private approximation algorithms for hierarchical clustering under the rigorous framework introduced by Dasgupta (2016). We show strong lower bounds for the problem: any $\epsilon$-DP algorithm must exhibit $O(|V|^2/\epsilon)$-additive error for an input dataset $V$. We then exhibit a polynomial-time approximation algorithm with $O(|V|^{2.5}/\epsilon)$-additive error, and an exponential-time algorithm that meets the lower bound. To overcome the lower bound, we focus on the stochastic block model, a popular model of graphs, and, under a separation assumption on the blocks, propose a private $1+o(1)$-approximation algorithm that also recovers the blocks exactly. Finally, we perform an empirical study of our algorithms and validate their performance.
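For readers unfamiliar with the objective the abstract's guarantees refer to, Dasgupta's (2016) cost of a hierarchy can be sketched as below. This is an illustrative implementation only, not the paper's private algorithm; the nested-tuple tree encoding and function names are assumptions made for the example.

```python
def leaves(tree):
    """Return the set of leaf labels under `tree` (a leaf label or a tuple of subtrees)."""
    if not isinstance(tree, tuple):
        return {tree}
    return set().union(*(leaves(sub) for sub in tree))

def dasgupta_cost(tree, edges):
    """Dasgupta cost: sum over weighted edges (u, v, w) of
    w * |leaves(LCA(u, v))|, where LCA is taken in the hierarchy `tree`."""
    if not isinstance(tree, tuple):
        return 0
    here = leaves(tree)
    cost = 0
    for u, v, w in edges:
        if u in here and v in here:
            # Charge the edge at this node only if no single child contains
            # both endpoints, i.e. this node is their lowest common ancestor.
            if not any(u in leaves(sub) and v in leaves(sub) for sub in tree):
                cost += w * len(here)
    return cost + sum(dasgupta_cost(sub, edges) for sub in tree)

# A 4-node path graph with heavy end edges: merging the heavy pairs first
# gives a lower cost than splitting them across the top-level cut.
edges = [("a", "b", 3), ("c", "d", 3), ("b", "c", 1)]
good = (("a", "b"), ("c", "d"))  # cost 3*2 + 3*2 + 1*4 = 16
bad = (("a", "c"), ("b", "d"))   # cost 3*4 + 3*4 + 1*4 = 28
```

A good hierarchy separates heavy edges as low (late) as possible; the paper's additive-error bounds measure how far a private tree's cost may exceed the optimum of this objective.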
