FedOne: Query-Efficient Federated Learning for Black-box Discrete Prompt Learning

Ganyu Wang
Jinjie Fang
Maxwell J. Ying
Bin Gu
Xi Chen
Boyu Wang
Charles Ling
Main: 8 pages · Appendix: 21 pages · Bibliography: 4 pages · 6 figures · 9 tables
Abstract

Black-Box Discrete Prompt Learning (BDPL) is a prompt-tuning method that optimizes discrete prompts without access to model parameters or gradients, making prompt tuning feasible on cloud-based Large Language Models (LLMs). Adapting federated learning to BDPL could further enhance prompt-tuning performance by leveraging data from diverse sources. However, previous research on federated black-box prompt tuning has neglected the substantial query cost associated with cloud-based LLM services. To address this gap, we conducted a theoretical analysis of query efficiency in federated black-box prompt tuning. Our findings reveal that degrading FedAvg to activate only one client per round, a strategy we call FedOne, achieves optimal query efficiency in federated black-box prompt learning. Building on this insight, we propose the FedOne framework, a federated black-box discrete prompt learning method designed to maximize query efficiency when interacting with cloud-based LLMs. We conducted numerical experiments on various aspects of our framework, demonstrating a significant improvement in query efficiency that aligns with our theoretical results.
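
The abstract's core mechanism, FedAvg degraded to a single active client per round, can be illustrated with a short sketch. The Python below is a hypothetical, minimal rendering of that loop, not the authors' implementation: query_llm_loss, the REINFORCE-style gradient estimator, and all hyperparameters are illustrative assumptions standing in for the paper's BDPL estimator and a metered cloud-LLM API.

import numpy as np

rng = np.random.default_rng(0)
P, V = 8, 50              # prompt length, candidate-vocabulary size (assumed)
logits = np.zeros((P, V)) # server-side parameters of the discrete prompt distribution

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def query_llm_loss(prompt_ids, client_data):
    # Placeholder for a metered cloud-LLM API call returning a scalar loss;
    # in the black-box setting, this is the only feedback available.
    return rng.random()

def client_update(logits, client_data, n_samples=4, lr=0.5):
    # One client's black-box update: a REINFORCE-style score-function
    # estimator that uses only loss values, never model gradients.
    probs = softmax(logits)
    grad = np.zeros_like(logits)
    for _ in range(n_samples):
        prompt = np.array([rng.choice(V, p=probs[p]) for p in range(P)])
        loss = query_llm_loss(prompt, client_data)
        one_hot = np.zeros_like(logits)
        one_hot[np.arange(P), prompt] = 1.0
        grad += loss * (one_hot - probs)  # loss-weighted grad of log-prob
    return logits - lr * grad / n_samples

clients = [f"client_{i}" for i in range(20)]    # each holds private local data
for round_ in range(100):
    k = rng.integers(len(clients))              # FedOne: activate exactly ONE client
    logits = client_update(logits, clients[k])  # its update becomes the global state

In full FedAvg, the server would activate and aggregate many clients per round, multiplying the number of paid LLM queries; the single-client activation sketched here is the property the paper argues yields optimal query efficiency.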

@article{wang2025_2506.14929,
  title={FedOne: Query-Efficient Federated Learning for Black-box Discrete Prompt Learning},
  author={Ganyu Wang and Jinjie Fang and Maxwell J. Ying and Bin Gu and Xi Chen and Boyu Wang and Charles Ling},
  journal={arXiv preprint arXiv:2506.14929},
  year={2025}
}