SpanBERT: Improving Pre-training by Representing and Predicting Spans
Mandar Joshi, Danqi Chen, Yinhan Liu, Daniel S. Weld, Luke Zettlemoyer, Omer Levy
24 July 2019 (arXiv:1907.10529)
Papers citing "SpanBERT: Improving Pre-training by Representing and Predicting Spans" (5 of 955 shown)
RoBERTa: A Robustly Optimized BERT Pretraining Approach
Yinhan Liu, Myle Ott, Naman Goyal, Jingfei Du, Mandar Joshi, Danqi Chen, Omer Levy, M. Lewis, Luke Zettlemoyer, Veselin Stoyanov
26 Jul 2019

BERTphone: Phonetically-Aware Encoder Representations for Utterance-Level Speaker and Language Recognition
Shaoshi Ling, Julian Salazar, Yuzong Liu, Katrin Kirchhoff
30 Jun 2019

Pre-Training with Whole Word Masking for Chinese BERT
Yiming Cui, Wanxiang Che, Ting Liu, Bing Qin, Ziqing Yang
19 Jun 2019

Better Long-Range Dependency By Bootstrapping A Mutual Information Regularizer
Yanshuai Cao, Peng Xu
28 May 2019

Dual Co-Matching Network for Multi-choice Reading Comprehension
Shuailiang Zhang, Zhao Hai, Yuwei Wu, Zhuosheng Zhang, Xi Zhou, Xiaoping Zhou
27 Jan 2019