MPNet: Masked and Permuted Pre-training for Language Understanding
arXiv:2004.09297 · 20 April 2020
Kaitao Song, Xu Tan, Tao Qin, Jianfeng Lu, Tie-Yan Liu
Papers citing "MPNet: Masked and Permuted Pre-training for Language Understanding" (14 of 164 shown)
Explainable Fact-checking through Question Answering (11 Oct 2021) [HILM]
Jing Yang, D. Vega-Oliveros, Taís Seibt, Anderson de Rezende Rocha

Automatic Generation of Word Problems for Academic Education via Natural Language Processing (NLP) (27 Sep 2021)
Stanley Uros Keller

Multiplicative Position-aware Transformer Models for Language Understanding (27 Sep 2021)
Zhiheng Huang, Davis Liang, Peng Xu, Bing Xiang

Small-Bench NLP: Benchmark for small single GPU trained models in Natural Language Processing (22 Sep 2021) [ALM, MoE]
K. Kanakarajan, Bhuvana Kundumani, Malaikannan Sankarasubbu

Exploring the Promises of Transformer-Based LMs for the Representation of Normative Claims in the Legal Domain (25 Aug 2021) [AILaw]
Reto Gubelmann, Peter Hongler, Siegfried Handschuh

Improved Text Classification via Contrastive Adversarial Training (21 Jul 2021) [AAML]
Lin Pan, Chung-Wei Hang, Avirup Sil, Saloni Potdar

SCARF: Self-Supervised Contrastive Learning using Random Feature Corruption (29 Jun 2021) [SSL]
Dara Bahri, Heinrich Jiang, Yi Tay, Donald Metzler

Pre-Trained Models: Past, Present and Future (14 Jun 2021) [AIFin, MQ, AI4MH]
Xu Han, Zhengyan Zhang, Ning Ding, Yuxian Gu, Xiao Liu, ..., Jie Tang, Ji-Rong Wen, Jinhui Yuan, Wayne Xin Zhao, Jun Zhu

MusicBERT: Symbolic Music Understanding with Large-Scale Pre-Training (10 Jun 2021)
Mingliang Zeng, Xu Tan, Rui Wang, Zeqian Ju, Tao Qin, Tie-Yan Liu

Diverse Image Inpainting with Bidirectional and Autoregressive Transformers (26 Apr 2021) [ViT]
Yingchen Yu, Fangneng Zhan, Rongliang Wu, Jianxiong Pan, Kaiwen Cui, Shijian Lu, Feiying Ma, Xuansong Xie, Chunyan Miao

ERNIE-Gram: Pre-Training with Explicitly N-Gram Masked Language Modeling for Natural Language Understanding (23 Oct 2020)
Dongling Xiao, Yukun Li, Han Zhang, Yu Sun, Hao Tian, Hua Wu, Haifeng Wang

Improve Transformer Models with Better Relative Position Embeddings (28 Sep 2020) [ViT]
Zhiheng Huang, Davis Liang, Peng Xu, Bing Xiang

Funnel-Transformer: Filtering out Sequential Redundancy for Efficient Language Processing (05 Jun 2020)
Zihang Dai, Guokun Lai, Yiming Yang, Quoc V. Le

GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding (20 Apr 2018) [ELM]
Alex Jinpeng Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman