Coordination without communication: optimal regret in two players multi-armed bandits

Sébastien Bubeck, Thomas Budzinski
14 February 2020 (arXiv:2002.07596)

Papers citing "Coordination without communication: optimal regret in two players multi-armed bandits"

6 papers shown:
  1. Nikolai Karpov, Qin Zhang. Communication-Efficient Collaborative Regret Minimization in Multi-Armed Bandits. 26 Jan 2023.
  2. Etienne Boursier, Vianney Perchet. A survey on multi-player bandits. 29 Nov 2022.
  3. Aldo Pacchiano, Peter L. Bartlett, Michael I. Jordan. An Instance-Dependent Analysis for the Cooperative Multi-Player Multi-Armed Bandit. 08 Nov 2021.
  4. Yihan Du, Wei Chen, Yuko Kuroki, Longbo Huang. Collaborative Pure Exploration in Kernel Bandit. 29 Oct 2021.
  5. Chengshuai Shi, Wei Xiong, Cong Shen, Jing Yang. Heterogeneous Multi-player Multi-armed Bandits: Closing the Gap and Generalization. 27 Oct 2021.
  6. Chengshuai Shi, Cong Shen. On No-Sensing Adversarial Multi-player Multi-armed Bandits with Collision Communications. 02 Nov 2020.