Decentralized Exploration in Multi-Armed Bandits -- Extended version
arXiv:1811.07763, 19 November 2018
Raphael Feraud
Réda Alami
Romain Laroche
Community: FedML

Papers citing "Decentralized Exploration in Multi-Armed Bandits -- Extended version" (2 papers)

Cooperative Multi-agent Bandits: Distributed Algorithms with Optimal Individual Regret and Constant Communication Costs
L. Yang, Xuchuang Wang, Mohammad Hajiesmaili, Lijun Zhang, John C. S. Lui, Don Towsley
08 Aug 2023

Online Learning for Cooperative Multi-Player Multi-Armed Bandits
William Chang, Mehdi Jafarnia-Jahromi, Rahul Jain
07 Sep 2021