Pretraining the Vision Transformer using self-supervised methods for vision based Deep Reinforcement Learning
Manuel Goulão, Arlindo L. Oliveira
arXiv 2209.10901 · 22 September 2022 · ViT
Papers citing "Pretraining the Vision Transformer using self-supervised methods for vision based Deep Reinforcement Learning" (10 of 10 papers shown):
Uncovering RL Integration in SSL Loss: Objective-Specific Implications for Data-Efficient RL
Ömer Veysel Çağatan, Barış Akgün · OffRL · 22 Oct 2024

Unsupervised Salient Patch Selection for Data-Efficient Reinforcement Learning
Zhaohui Jiang, Paul Weng · OffRL · 10 Jan 2024

Masked Feature Modelling: Feature Masking for the Unsupervised Pre-training of a Graph Attention Network Block for Bottom-up Video Event Recognition
Dimitrios Daskalakis, Nikolaos Gkalelis, Vasileios Mezaris · 24 Aug 2023

Transformers in Reinforcement Learning: A Survey
Pranav Agarwal, A. Rahman, P. St-Charles, Simon J. D. Prince, Samira Ebrahimi Kahou · OffRL · 12 Jul 2023

Transformer in Transformer as Backbone for Deep Reinforcement Learning
Hangyu Mao, Rui Zhao, Hao Chen, Jianye Hao, Yiqun Chen, Dong Li, Junge Zhang, Zhen Xiao · OffRL · 30 Dec 2022

Improving Policy Learning via Language Dynamics Distillation
Victor Zhong, Jesse Mu, Luke Zettlemoyer, Edward Grefenstette, Tim Rocktaschel · OffRL · 30 Sep 2022

Masked Autoencoders Are Scalable Vision Learners
Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár, Ross B. Girshick · ViT, TPM · 11 Nov 2021

Emerging Properties in Self-Supervised Vision Transformers
Mathilde Caron, Hugo Touvron, Ishan Misra, Hervé Jégou, Julien Mairal, Piotr Bojanowski, Armand Joulin · 29 Apr 2021

Decoupling Representation Learning from Reinforcement Learning
Adam Stooke, Kimin Lee, Pieter Abbeel, Michael Laskin · SSL, DRL · 14 Sep 2020

Improved Baselines with Momentum Contrastive Learning
Xinlei Chen, Haoqi Fan, Ross B. Girshick, Kaiming He · SSL · 09 Mar 2020