Locally Differentially Private Minimum Finding

27 May 2019
Kazuto Fukuchi
Chia-Mu Yu
Arashi Haishima
Jun Sakuma
Abstract

We investigate the problem of finding the minimum, in which each user has a real value and we want to estimate the minimum of these values under the local differential privacy constraint. We reveal that this problem is fundamentally difficult: no mechanism can be consistent in the worst case. Instead of considering the worst case, we aim to construct a private mechanism whose error rate adapts to how easy the minimum is to estimate. As a measure of easiness, we introduce a parameter $\alpha$ that characterizes the fatness of the minimum-side tail of the user data distribution. As a result, we reveal that the mechanism can achieve $O((\ln^6 N/\epsilon^2 N)^{1/2\alpha})$ error without knowledge of $\alpha$, and this error rate is near-optimal in the sense that any mechanism incurs $\Omega((1/\epsilon^2 N)^{1/2\alpha})$ error. Furthermore, we demonstrate through empirical evaluations on synthetic datasets that our mechanism outperforms a naive mechanism. We also conducted experiments on the MovieLens dataset and a purchase history dataset and demonstrate that our algorithm achieves $\tilde{O}((1/N)^{1/2\alpha})$ error adaptively to $\alpha$.
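
To make the problem setup concrete, here is a minimal Python sketch of one possible naive baseline, not the paper's mechanism: each user perturbs their value with a Laplace randomizer satisfying $\epsilon$-local differential privacy, and the aggregator reports the minimum of the noisy values. The function names, the Laplace noise, and the clipping range are illustrative assumptions; the abstract only states that such a naive approach is outperformed by the proposed adaptive mechanism.

```python
import numpy as np

def laplace_ldp_report(value, epsilon, lo=0.0, hi=1.0):
    """Hypothetical local randomizer: clip the user's value to [lo, hi] and
    add Laplace noise calibrated to the range, giving epsilon-local DP."""
    clipped = np.clip(value, lo, hi)
    scale = (hi - lo) / epsilon  # sensitivity of a single clipped value is (hi - lo)
    return clipped + np.random.laplace(loc=0.0, scale=scale)

def naive_minimum_estimate(values, epsilon):
    """Naive baseline: minimum of the locally perturbed reports.
    The minimum of N i.i.d. Laplace noise terms drifts downward roughly like
    -scale * ln(N), so this estimate becomes badly biased as N grows,
    which is why an adaptive mechanism is needed."""
    reports = [laplace_ldp_report(v, epsilon) for v in values]
    return min(reports)

# Example: N users whose values cluster away from 0, true minimum around 0.3.
rng = np.random.default_rng(0)
user_values = 0.3 + 0.7 * rng.beta(2.0, 5.0, size=10_000)
print("true minimum :", user_values.min())
print("naive LDP est:", naive_minimum_estimate(user_values, epsilon=1.0))
```

Running the sketch shows the naive estimate falling far below the true minimum, illustrating the difficulty the paper addresses: the error of the naive approach is dominated by the extreme tail of the injected noise rather than by the data itself.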
