ResearchTrend.AI
arXiv:2109.13977
Risk averse non-stationary multi-armed bandits

28 September 2021
Leo Benac, Frédéric Godin
Papers citing "Risk averse non-stationary multi-armed bandits" (8 papers)
Optimal Thompson Sampling strategies for support-aware CVaR bandits
Dorian Baudry, Romain Gautron, E. Kaufmann, Odalric-Ambrym Maillard
10 Dec 2020

Risk-Constrained Thompson Sampling for CVaR Bandits
Joel Q. L. Chang, Qiuyu Zhu, Vincent Y. F. Tan
16 Nov 2020

Statistically Robust, Risk-Averse Best Arm Identification in Multi-Armed Bandits
Anmol Kagrecha, Jayakrishnan Nair, Krishna Jagannathan
28 Aug 2020

Risk-Averse Action Selection Using Extreme Value Theory Estimates of the CVaR
Dylan Troop, Frédéric Godin, Jia Yuan Yu
03 Dec 2019

A Survey on Practical Applications of Multi-Armed and Contextual Bandits
Djallel Bouneffouf, Irina Rish
02 Apr 2019

Concentration bounds for CVaR estimation: The cases of light-tailed and heavy-tailed distributions
Prashanth L.A., Krishna Jagannathan, R. Kolla
04 Jan 2019

Concentration bounds for empirical conditional value-at-risk: The unbounded case
R. Kolla, Prashanth L.A., Sanjay P. Bhat, Krishna Jagannathan
06 Aug 2018

Exploration vs Exploitation vs Safety: Risk-averse Multi-Armed Bandits
Nicolas Galichet, Michèle Sebag, O. Teytaud
06 Jan 2014