ResearchTrend.AI
SpanBERT: Improving Pre-training by Representing and Predicting Spans

24 July 2019
Mandar Joshi, Danqi Chen, Yinhan Liu, Daniel S. Weld, Luke Zettlemoyer, Omer Levy

Papers citing "SpanBERT: Improving Pre-training by Representing and Predicting Spans"

Showing 5 of 955 citing papers:
  1. RoBERTa: A Robustly Optimized BERT Pretraining Approach
     Yinhan Liu, Myle Ott, Naman Goyal, Jingfei Du, Mandar Joshi, Danqi Chen, Omer Levy, M. Lewis, Luke Zettlemoyer, Veselin Stoyanov
     26 Jul 2019
  2. BERTphone: Phonetically-Aware Encoder Representations for Utterance-Level Speaker and Language Recognition
     Shaoshi Ling, Julian Salazar, Yuzong Liu, Katrin Kirchhoff
     30 Jun 2019
  3. Pre-Training with Whole Word Masking for Chinese BERT
     Yiming Cui, Wanxiang Che, Ting Liu, Bing Qin, Ziqing Yang
     19 Jun 2019
  4. Better Long-Range Dependency By Bootstrapping A Mutual Information Regularizer
     Yanshuai Cao, Peng Xu
     28 May 2019
  5. Dual Co-Matching Network for Multi-choice Reading Comprehension
     Shuailiang Zhang, Zhao Hai, Yuwei Wu, Zhuosheng Zhang, Xi Zhou, Xiaoping Zhou
     27 Jan 2019