Learning to Sample Replacements for ELECTRA Pre-Training
Y. Hao, Li Dong, Hangbo Bao, Ke Xu, Furu Wei
arXiv: 2106.13715 · 25 June 2021
Papers citing "Learning to Sample Replacements for ELECTRA Pre-Training" (9 papers shown):

1. Pre-Training Transformers as Energy-Based Cloze Models. Kevin Clark, Minh-Thang Luong, Quoc V. Le, Christopher D. Manning. 15 Dec 2020.

2. Calibrating Deep Neural Networks using Focal Loss. Jishnu Mukhoti, Viveka Kulharia, Amartya Sanyal, Stuart Golodetz, Philip Torr, P. Dokania. 21 Feb 2020.

3. ALBERT: A Lite BERT for Self-supervised Learning of Language Representations. Zhenzhong Lan, Mingda Chen, Sebastian Goodman, Kevin Gimpel, Piyush Sharma, Radu Soricut. 26 Sep 2019.

4. Unified Language Model Pre-training for Natural Language Understanding and Generation. Li Dong, Nan Yang, Wenhui Wang, Furu Wei, Xiaodong Liu, Yu Wang, Jianfeng Gao, M. Zhou, H. Hon. 08 May 2019.

5. Neural Network Acceptability Judgments. Alex Warstadt, Amanpreet Singh, Samuel R. Bowman. 31 May 2018.

6. GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding. Alex Jinpeng Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman. 20 Apr 2018.

7. Deep contextualized word representations. Matthew E. Peters, Mark Neumann, Mohit Iyyer, Matt Gardner, Christopher Clark, Kenton Lee, Luke Zettlemoyer. 15 Feb 2018.

8. A Broad-Coverage Challenge Corpus for Sentence Understanding through Inference. Adina Williams, Nikita Nangia, Samuel R. Bowman. 18 Apr 2017.

9. SQuAD: 100,000+ Questions for Machine Comprehension of Text. Pranav Rajpurkar, Jian Zhang, Konstantin Lopyrev, Percy Liang. 16 Jun 2016.