Taking Notes on the Fly Helps BERT Pre-training
arXiv:2008.01466 | 4 August 2020
Qiyu Wu, Chen Xing, Yatao Li, Guolin Ke, Di He, Tie-Yan Liu
Papers citing "Taking Notes on the Fly Helps BERT Pre-training" (5 of 5 papers shown):
1. Prompt Combines Paraphrase: Teaching Pre-trained Models to Understand Rare Biomedical Words
   Hao Wang, Chi-Liang Liu, Nuwa Xi, Sendong Zhao, Meizhi Ju, Shiwei Zhang, Ziheng Zhang, Yefeng Zheng, Bing Qin, Ting Liu
   Tags: VLM, AAML, LM&MA | 41 / 6 / 0 | 14 Sep 2022

2. Token Dropping for Efficient BERT Pretraining
   Le Hou, Richard Yuanzhe Pang, Dinesh Manocha, Yuexin Wu, Xinying Song, Xiaodan Song, Denny Zhou
   22 / 43 / 0 | 24 Mar 2022

3. Combining Transformers with Natural Language Explanations
   Federico Ruggeri, Marco Lippi, Paolo Torroni
   25 / 1 / 0 | 02 Sep 2021

4. Pretrained Transformers for Text Ranking: BERT and Beyond
   Jimmy J. Lin, Rodrigo Nogueira, Andrew Yates
   Tags: VLM | 244 / 612 / 0 | 13 Oct 2020

5. GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
   Alex Jinpeng Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
   Tags: ELM | 299 / 6,996 / 0 | 20 Apr 2018