Heterogeneous Multi-player Multi-armed Bandits: Closing the Gap and Generalization

27 October 2021
Chengshuai Shi, Wei Xiong, Cong Shen, Jing Yang

Papers citing "Heterogeneous Multi-player Multi-armed Bandits: Closing the Gap and Generalization"

4 / 4 papers shown:

  1. Harnessing the Power of Federated Learning in Federated Contextual Bandits
     Chengshuai Shi, Ruida Zhou, Kun Yang, Cong Shen. 26 Dec 2023. [FedML]
  2. Cooperative Multi-agent Bandits: Distributed Algorithms with Optimal Individual Regret and Constant Communication Costs
     L. Yang, Xuchuang Wang, Mohammad Hajiesmaili, Lijun Zhang, John C. S. Lui, Don Towsley. 08 Aug 2023.
  3. A survey on multi-player bandits
     Etienne Boursier, Vianney Perchet. 29 Nov 2022.
  4. Matroid Bandits: Fast Combinatorial Optimization with Learning
     B. Kveton, Zheng Wen, Azin Ashkan, Hoda Eydgahi, Brian Eriksson. 20 Mar 2014.