ResearchTrend.AI
Hierarchical Exploration for Accelerating Contextual Bandits

27 June 2012
Yisong Yue
S. Hong
Carlos Guestrin
arXiv: 1206.6454

Papers citing "Hierarchical Exploration for Accelerating Contextual Bandits"

7 papers shown:

  • Bilinear Bandits with Low-rank Structure (Kwang-Sung Jun, Rebecca Willett, S. Wright, Robert D. Nowak; 08 Jan 2019)
  • Partially Observable Markov Decision Process for Recommender Systems (Zhongqi Lu, Qiang Yang; 28 Aug 2016)
  • On Context-Dependent Clustering of Bandits (Claudio Gentile, Shuai Li, Purushottam Kar, Alexandros Karatzoglou, Evans Etrue, Giovanni Zappella; 06 Aug 2016)
  • Latent Contextual Bandits and their Application to Personalized Recommendations for New Users (Li Zhou, Emma Brunskill; 22 Apr 2016)
  • Context-Aware Bandits (Shuai Li, Purushottam Kar; 12 Oct 2015)
  • A Survey of Online Experiment Design with the Stochastic Multi-Armed Bandit (Giuseppe Burtini, Jason L. Loeppky, Ramon Lawrence; 02 Oct 2015)
  • Online Clustering of Bandits (Claudio Gentile, Shuai Li, Giovanni Zappella; 31 Jan 2014)