ResearchTrend.AI

Data Poisoning Attacks to Locally Differentially Private Range Query Protocols

5 March 2025
Ting-Wei Liao
Chih-Hsun Lin
Yu-Lin Tsai
Takao Murakami
Chia-Mu Yu
Jun Sakuma
Chun-Ying Huang
Hiroaki Kikuchi
Abstract

Trajectory data, which tracks movements through geographic locations, is crucial for improving real-world applications. However, collecting such sensitive data raises considerable privacy concerns. Local differential privacy (LDP) offers a solution by allowing individuals to locally perturb their trajectory data before sharing it. Despite its privacy benefits, LDP protocols are vulnerable to data poisoning attacks, where attackers inject fake data to manipulate aggregated results. In this work, we make the first attempt to analyze vulnerabilities in several representative LDP trajectory protocols. We propose TraP, a heuristic algorithm for data Poisoning attacks that uses a prefix-suffix method to optimize fake Trajectory selection, significantly reducing computational complexity. Our experimental results demonstrate that our attack can substantially increase target pattern occurrences in the perturbed trajectory dataset with few fake users. This study underscores the urgent need for robust defenses and better protocol designs to safeguard LDP trajectory data against malicious manipulation.
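The attack surface the abstract describes can be illustrated with a toy example. The sketch below is not the paper's TraP algorithm or its trajectory protocols; it shows the general mechanism with a standard k-ary randomized response (generalized randomized response) over a small hypothetical location domain: honest users perturb their value locally, the aggregator computes unbiased frequency estimates, and an attacker who controls a few fake users has them report a target value directly, inflating its estimated frequency. All domain values, counts, and parameters here are made up for illustration.

```python
import math
import random

def grr_perturb(value, domain, eps, rng):
    """k-ary randomized response: keep the true value with probability p,
    otherwise report one of the other k-1 values uniformly at random."""
    k = len(domain)
    p = math.exp(eps) / (math.exp(eps) + k - 1)
    if rng.random() < p:
        return value
    return rng.choice([v for v in domain if v != value])

def estimate_counts(reports, domain, eps):
    """Standard unbiased count estimator for k-ary randomized response:
    est(v) = (observed_count(v) - n*q) / (p - q)."""
    k = len(reports), len(domain)
    n, k = len(reports), len(domain)
    p = math.exp(eps) / (math.exp(eps) + k - 1)
    q = 1.0 / (math.exp(eps) + k - 1)
    return {v: (sum(1 for r in reports if r == v) - n * q) / (p - q)
            for v in domain}

rng = random.Random(0)
domain = ["A", "B", "C", "D"]      # hypothetical location grid cells
eps = 1.0                          # privacy budget
true_data = ["A"] * 50 + ["B"] * 30 + ["C"] * 15 + ["D"] * 5

# Honest users perturb locally before sharing.
reports = [grr_perturb(v, domain, eps, rng) for v in true_data]

# Poisoning: m fake users skip perturbation and report the target directly.
target, m = "D", 20
poisoned = reports + [target] * m

clean = estimate_counts(reports, domain, eps)
dirty = estimate_counts(poisoned, domain, eps)
```

Because the estimator inverts the perturbation, each fake report of the target adds (1 - q)/(p - q) > 1 to its estimated count, so even a handful of fake users noticeably inflates the target's frequency — the effect the paper exploits (for trajectory patterns rather than single values, and with a prefix-suffix heuristic to choose which fake trajectories to inject).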

@article{liao2025_2503.03454,
  title={Data Poisoning Attacks to Locally Differentially Private Range Query Protocols},
  author={Ting-Wei Liao and Chih-Hsun Lin and Yu-Lin Tsai and Takao Murakami and Chia-Mu Yu and Jun Sakuma and Chun-Ying Huang and Hiroaki Kikuchi},
  journal={arXiv preprint arXiv:2503.03454},
  year={2025}
}