arXiv:2208.14555
Dynamic Global Sensitivity for Differentially Private Contextual Bandits

30 August 2022
Huazheng Wang
Dave Zhao
Hongning Wang
Abstract

Bandit algorithms have become a reference solution for interactive recommendation. However, as such algorithms directly interact with users to improve recommendations, serious privacy concerns have been raised regarding their practical use. In this work, we propose a differentially private linear contextual bandit algorithm that uses a tree-based mechanism to add Laplace or Gaussian noise to the model parameters. Our key insight is that as the model converges during online updates, the global sensitivity of its parameters shrinks over time (hence the name dynamic global sensitivity). Compared with existing solutions, our dynamic global sensitivity analysis allows us to inject less noise: we obtain $(\epsilon, \delta)$-differential privacy with added regret caused by noise injection of only $\tilde{O}(\log T \sqrt{T}/\epsilon)$. We provide a rigorous theoretical analysis of the amount of noise added via dynamic global sensitivity and the corresponding upper regret bound of the proposed algorithm. Experimental results on both synthetic and real-world datasets confirm the algorithm's advantage over existing solutions.
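The abstract does not give pseudocode, but the tree-based mechanism it builds on is the standard binary-tree (tree aggregation) mechanism for privately releasing running sums under continual observation: each of the $O(\log T)$ dyadic tree nodes covering a prefix receives independent Laplace noise calibrated to its sensitivity. Below is a minimal generic sketch of that building block, not the paper's algorithm; all names (`TreeMechanism`, the fixed `sensitivity` parameter) are illustrative, and the paper's contribution is precisely to let the sensitivity shrink over rounds rather than stay fixed as here.

```python
import math
import random


def laplace(scale):
    # Sample Laplace(0, scale) via inverse-CDF transform.
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)


class TreeMechanism:
    """Binary-tree mechanism for differentially private prefix sums.

    Every time step t is covered by O(log T) dyadic tree nodes; a node at
    (level L, index i) covers steps [i*2^L + 1, (i+1)*2^L]. Each node gets
    its own cached Laplace noise, so any single data point influences only
    O(log T) noisy values, and a prefix sum is answered from O(log T) nodes.
    """

    def __init__(self, T, epsilon, sensitivity=1.0):
        self.levels = max(1, math.ceil(math.log2(T)) + 1)
        # Split the privacy budget evenly across the tree levels.
        self.scale = sensitivity * self.levels / epsilon
        self.node_sums = {}   # (level, index) -> true partial sum
        self.node_noise = {}  # (level, index) -> cached noise draw

    def _noisy(self, level, idx):
        key = (level, idx)
        if key not in self.node_noise:
            self.node_noise[key] = laplace(self.scale)
        return self.node_sums.get(key, 0.0) + self.node_noise[key]

    def add(self, t, value):
        # Fold the new observation into every node covering step t (1-indexed).
        for level in range(self.levels):
            key = (level, (t - 1) >> level)
            self.node_sums[key] = self.node_sums.get(key, 0.0) + value

    def prefix_sum(self, t):
        # Decompose [1, t] into disjoint dyadic blocks (one per set bit of t)
        # and sum their noisy node values.
        total, start = 0.0, 0
        for level in range(self.levels - 1, -1, -1):
            if t & (1 << level):
                total += self._noisy(level, start >> level)
                start += 1 << level
        return total
```

In the linear contextual bandit setting, the same idea is applied to the sufficient statistics of the model parameters rather than to scalar sums; shrinking (dynamic) global sensitivity then lets the `scale` used at each node decrease as the online estimate converges.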
