ResearchTrend.AI

arXiv: 1802.07917
Regional Multi-Armed Bandits


22 February 2018
Zhiyang Wang, Ruida Zhou, Cong Shen

Papers citing "Regional Multi-Armed Bandits"

9 papers shown
Causally Abstracted Multi-armed Bandits
Fabio Massimo Zennaro, Nicholas Bishop, Joel Dyer, Yorgos Felekis, Anisoara Calinescu, Michael Wooldridge, Theodoros Damoulas
26 Apr 2024
Evaluating COVID-19 vaccine allocation policies using Bayesian $m$-top exploration
Alexandra Cimpean, T. Verstraeten, L. Willem, N. Hens, Ann Nowé, Pieter J. K. Libin
30 Jan 2023
TSEC: a framework for online experimentation under experimental constraints
Simon Mak, Yuanshuo Zhou, Lavonne Hoang, C. F. J. Wu
17 Jan 2021
Multi-Armed Bandits with Dependent Arms
Rahul Singh, Fang Liu, Yin Sun, Ness B. Shroff
13 Oct 2020
Carousel Personalization in Music Streaming Apps with Contextual Bandits
Walid Bendada, Guillaume Salha-Galvan, Théo Bontempelli
14 Sep 2020
Crush Optimism with Pessimism: Structured Bandits Beyond Asymptotic Optimality
Kwang-Sung Jun, Chicheng Zhang
15 Jun 2020
Multi-Armed Bandits with Correlated Arms
Samarth Gupta, Shreyas Chaudhari, Gauri Joshi, Osman Yağan
06 Nov 2019
Optimal Exploitation of Clustering and History Information in Multi-Armed Bandit
Djallel Bouneffouf, Srinivasan Parthasarathy, Horst Samulowitz, Martin Wistuba
31 May 2019
Bounded regret in stochastic multi-armed bandits
Sébastien Bubeck, Vianney Perchet, Philippe Rigollet
06 Feb 2013