arXiv: 2202.00710
Improving Sample Efficiency of Value Based Models Using Attention and Vision Transformers
1 February 2022
Amir Ardalan Kalantari
Mohammad Amini
Sarath Chandar
Doina Precup
Papers citing "Improving Sample Efficiency of Value Based Models Using Attention and Vision Transformers" (6 papers):
Vision Transformers for End-to-End Vision-Based Quadrotor Obstacle Avoidance
Anish Bhattacharya, Nishanth Rao, Dhruv Parikh, Pratik Kunapuli, Nikolai Matni, Vijay R. Kumar
16 May 2024

Unsupervised Salient Patch Selection for Data-Efficient Reinforcement Learning
Zhaohui Jiang, Paul Weng
Topics: OffRL
10 Jan 2024

A Survey on Transformers in Reinforcement Learning
Wenzhe Li, Hao Luo, Zichuan Lin, Chongjie Zhang, Zongqing Lu, Deheng Ye
Topics: OffRL, MU, AI4CE
08 Jan 2023

Deep Reinforcement Learning with Swin Transformers
Li Meng, Morten Goodwin, Anis Yazidi, P. Engelstad
Topics: ViT
30 Jun 2022

Decoupling Representation Learning from Reinforcement Learning
Adam Stooke, Kimin Lee, Pieter Abbeel, Michael Laskin
Topics: SSL, DRL
14 Sep 2020

On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
Topics: ODL
15 Sep 2016