ResearchTrend.AI
Taking Notes on the Fly Helps BERT Pre-training

arXiv:2008.01466 · 4 August 2020
Qiyu Wu, Chen Xing, Yatao Li, Guolin Ke, Di He, Tie-Yan Liu

Papers citing "Taking Notes on the Fly Helps BERT Pre-training"

5 / 5 papers shown
1. Prompt Combines Paraphrase: Teaching Pre-trained Models to Understand Rare Biomedical Words
   Hao Wang, Chi-Liang Liu, Nuwa Xi, Sendong Zhao, Meizhi Ju, Shiwei Zhang, Ziheng Zhang, Yefeng Zheng, Bing Qin, Ting Liu
   Topics: VLM, AAML, LM&MA · 14 Sep 2022

2. Token Dropping for Efficient BERT Pretraining
   Le Hou, Richard Yuanzhe Pang, Dinesh Manocha, Yuexin Wu, Xinying Song, Xiaodan Song, Denny Zhou
   24 Mar 2022

3. Combining Transformers with Natural Language Explanations
   Federico Ruggeri, Marco Lippi, Paolo Torroni
   02 Sep 2021

4. Pretrained Transformers for Text Ranking: BERT and Beyond
   Jimmy J. Lin, Rodrigo Nogueira, Andrew Yates
   Topics: VLM · 13 Oct 2020

5. GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
   Alex Jinpeng Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
   Topics: ELM · 20 Apr 2018