Hierarchical Exploration for Accelerating Contextual Bandits (arXiv:1206.6454)
27 June 2012
Yisong Yue, S. Hong, Carlos Guestrin

Papers citing "Hierarchical Exploration for Accelerating Contextual Bandits" (7 papers)

Bilinear Bandits with Low-rank Structure
Kwang-Sung Jun, Rebecca Willett, S. Wright, Robert D. Nowak
08 Jan 2019

Partially Observable Markov Decision Process for Recommender Systems
Zhongqi Lu, Qiang Yang
28 Aug 2016

On Context-Dependent Clustering of Bandits
Claudio Gentile, Shuai Li, Purushottam Kar, Alexandros Karatzoglou, Evans Etrue, Giovanni Zappella
06 Aug 2016

Latent Contextual Bandits and their Application to Personalized Recommendations for New Users
Li Zhou, Emma Brunskill
22 Apr 2016

Context-Aware Bandits
Shuai Li, Purushottam Kar
12 Oct 2015

A Survey of Online Experiment Design with the Stochastic Multi-Armed Bandit
Giuseppe Burtini, Jason L. Loeppky, Ramon Lawrence
02 Oct 2015

Online Clustering of Bandits
Claudio Gentile, Shuai Li, Giovanni Zappella
31 Jan 2014