To Pretrain or Not to Pretrain: Examining the Benefits of Pretraining on Resource Rich Tasks
Sinong Wang, Madian Khabsa, Hao Ma
arXiv:2006.08671 · 15 June 2020
Papers citing "To Pretrain or Not to Pretrain: Examining the Benefits of Pretraining on Resource Rich Tasks" (6 papers):
1. On the Role of Pre-trained Embeddings in Binary Code Analysis — Alwin Maier, Felix Weissberg, Konrad Rieck. 12 Feb 2025. 0 citations.
2. SpanBERT: Improving Pre-training by Representing and Predicting Spans — Mandar Joshi, Danqi Chen, Yinhan Liu, Daniel S. Weld, Luke Zettlemoyer, Omer Levy. 24 Jul 2019. 1,962 citations.
3. Rethinking ImageNet Pre-training — Kaiming He, Ross B. Girshick, Piotr Dollár. 21 Nov 2018. 1,083 citations. Tags: VLM, SSeg.
4. Deep contextualized word representations — Matthew E. Peters, Mark Neumann, Mohit Iyyer, Matt Gardner, Christopher Clark, Kenton Lee, Luke Zettlemoyer. 15 Feb 2018. 11,542 citations. Tags: NAI.
5. Revisiting Unreasonable Effectiveness of Data in Deep Learning Era — Chen Sun, Abhinav Shrivastava, Saurabh Singh, Abhinav Gupta. 10 Jul 2017. 2,393 citations. Tags: VLM.
6. Neural Machine Translation of Rare Words with Subword Units — Rico Sennrich, Barry Haddow, Alexandra Birch. 31 Aug 2015. 7,729 citations.