arXiv:2007.08620

The Monte Carlo Transformer: a stochastic self-attention model for sequence prediction
Alice Martin, Charles Ollion, Florian Strub, Sylvain Le Corff, Olivier Pietquin
15 July 2020
Papers citing "The Monte Carlo Transformer: a stochastic self-attention model for sequence prediction" (6 of 6 shown):

1. TransDreamer: Reinforcement Learning with Transformer World Models. Chang Chen, Yi-Fu Wu, Jaesik Yoon, Sungjin Ahn. 19 Feb 2022. [OffRL]
2. Transformer Uncertainty Estimation with Hierarchical Stochastic Attention. Jiahuan Pei, Cheng-Yu Wang, Gyuri Szarvas. 27 Dec 2021.
3. Pathologies in priors and inference for Bayesian transformers. Tristan Cinquin, Alexander Immer, Max Horn, Vincent Fortuin. 08 Oct 2021. [UQCV, BDL, MedIm]
4. Fast and Scalable Bayesian Deep Learning by Weight-Perturbation in Adam. Mohammad Emtiyaz Khan, Didrik Nielsen, Voot Tangkaratt, Wu Lin, Y. Gal, Akash Srivastava. 13 Jun 2018. [ODL]
5. Simple and Scalable Predictive Uncertainty Estimation using Deep Ensembles. Balaji Lakshminarayanan, Alexander Pritzel, Charles Blundell. 05 Dec 2016. [UQCV, BDL]
6. Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning. Y. Gal, Zoubin Ghahramani. 06 Jun 2015. [UQCV, BDL]