ERNIE-Gram: Pre-Training with Explicitly N-Gram Masked Language Modeling for Natural Language Understanding
arXiv: 2010.12148 · 23 October 2020
Dongling Xiao, Yukun Li, Han Zhang, Yu Sun, Hao Tian, Hua Wu, Haifeng Wang
Papers citing "ERNIE-Gram: Pre-Training with Explicitly N-Gram Masked Language Modeling for Natural Language Understanding" (5 of 5 papers shown)
Unsupervised Boundary-Aware Language Model Pretraining for Chinese Sequence Labeling
Peijie Jiang, Dingkun Long, Yanzhao Zhang, Pengjun Xie, Meishan Zhang, M. Zhang
SSL · 12 citations · 27 Oct 2022
CLOWER: A Pre-trained Language Model with Contrastive Learning over Word and Character Representations
Borun Chen, Hongyin Tang, Jiahao Bu, Kai Zhang, Jingang Wang, Qifan Wang, Haitao Zheng, Wei Wu, Liqian Yu
VLM · 1 citation · 23 Aug 2022
Analysing the Effect of Masking Length Distribution of MLM: An Evaluation Framework and Case Study on Chinese MRC Datasets
Changchang Zeng, Shaobo Li
6 citations · 29 Sep 2021
Pre-train, Prompt, and Predict: A Systematic Survey of Prompting Methods in Natural Language Processing
Pengfei Liu, Weizhe Yuan, Jinlan Fu, Zhengbao Jiang, Hiroaki Hayashi, Graham Neubig
VLM, SyDa · 3,831 citations · 28 Jul 2021
GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
ELM · 6,959 citations · 20 Apr 2018