Lexical Simplification with Pretrained Encoders (arXiv:1907.06226)
14 July 2019
Jipeng Qiang, Yun Li, Yi Zhu, Yunhao Yuan, Xindong Wu
Papers citing "Lexical Simplification with Pretrained Encoders" (7 papers shown)

- BERT has a Mouth, and It Must Speak: BERT as a Markov Random Field Language Model. Alex Jinpeng Wang, Kyunghyun Cho. 11 Feb 2019.
- BioBERT: a pre-trained biomedical language representation model for biomedical text mining. Jinhyuk Lee, Wonjin Yoon, Sungdong Kim, Donghyeon Kim, Sunkyu Kim, Chan Ho So, Jaewoo Kang. 25 Jan 2019.
- Cross-lingual Language Model Pretraining. Guillaume Lample, Alexis Conneau. 22 Jan 2019.
- A Word-Complexity Lexicon and A Neural Readability Ranking Model for Lexical Simplification. Mounica Maddela, Wenyuan Xu. 12 Oct 2018.
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova. 11 Oct 2018.
- Attention Is All You Need. Ashish Vaswani, Noam M. Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan Gomez, Lukasz Kaiser, Illia Polosukhin. 12 Jun 2017.
- For the sake of simplicity: Unsupervised extraction of lexical simplifications from Wikipedia. Mark Yatskar, B. Pang, Cristian Danescu-Niculescu-Mizil, Lillian Lee. 11 Aug 2010.