ResearchTrend.AI
Differentially Private Empirical Cumulative Distribution Functions

10 February 2025
Antoine Barczewski
Amal Mawass
Jan Ramon
    FedML
Abstract

In order to both learn from and protect sensitive training data, there has been growing interest in privacy-preserving machine learning methods, and differential privacy has emerged as an important measure of privacy. We are interested in the federated setting, where a group of parties each hold one or more training instances and want to learn collaboratively without revealing their data. In this paper, we propose strategies to compute differentially private empirical cumulative distribution functions. While revealing complete functions is more expensive in terms of privacy budget, it may also provide richer and more valuable information to the learner. We prove privacy guarantees and discuss the computational cost, both for a generic strategy fitting any security model and for a special-purpose strategy based on secret sharing. We survey a number of applications and present experiments.
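To make the object of study concrete: an empirical CDF evaluated at k thresholds is a vector of cumulative counts, and each count has sensitivity 1 (one record changes each count by at most 1). The sketch below shows the simplest centralised baseline only — a trusted curator releasing Laplace-noised counts with the budget split across thresholds — not the paper's federated or secret-sharing protocols; the function and parameter names are illustrative.

```python
import math
import random

def laplace_noise(scale, rng):
    """Draw one sample from Laplace(0, scale) via inverse-CDF sampling."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_ecdf(data, thresholds, epsilon, rng=None):
    """Centralised epsilon-DP release of an ECDF at fixed thresholds.

    Each of the k cumulative counts has sensitivity 1, so splitting
    epsilon over the k queries and adding Laplace(k/epsilon) noise to
    each count is epsilon-DP overall by sequential composition.
    """
    rng = rng or random.Random()
    n = len(data)
    k = len(thresholds)
    scale = k / epsilon
    noisy = [sum(1 for x in data if x <= t) + laplace_noise(scale, rng)
             for t in sorted(thresholds)]
    # Post-process (free under DP): clip counts to [0, n] and enforce
    # monotonicity so the released values form a valid CDF estimate.
    ecdf, running = [], 0.0
    for v in noisy:
        running = max(running, min(max(v, 0.0), n))
        ecdf.append(running / n)
    return ecdf
```

The naive budget split costs noise linear in k, which is one reason releasing a complete function is expensive in privacy budget; more refined mechanisms (and the distributed setting the paper targets) improve on this baseline.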

@article{barczewski2025_2502.06651,
  title={Differentially Private Empirical Cumulative Distribution Functions},
  author={Antoine Barczewski and Amal Mawass and Jan Ramon},
  journal={arXiv preprint arXiv:2502.06651},
  year={2025}
}