Structure-Level Knowledge Distillation For Multilingual Sequence Labeling
arXiv:2004.03846 · 8 April 2020
Xinyu Wang, Yong Jiang, Nguyen Bach, Tao Wang, Fei Huang, Kewei Tu
Papers citing "Structure-Level Knowledge Distillation For Multilingual Sequence Labeling" (9 papers)
Unsupervised Cross-lingual Representation Learning at Scale
Alexis Conneau, Kartikay Khandelwal, Naman Goyal, Vishrav Chaudhary, Guillaume Wenzek, Francisco Guzmán, Edouard Grave, Myle Ott, Luke Zettlemoyer, Veselin Stoyanov
05 Nov 2019
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter
Victor Sanh, Lysandre Debut, Julien Chaumond, Thomas Wolf
02 Oct 2019
TinyBERT: Distilling BERT for Natural Language Understanding
Xiaoqi Jiao, Yichun Yin, Lifeng Shang, Xin Jiang, Xiao Chen, Linlin Li, F. Wang, Qun Liu
23 Sep 2019
What Matters for Neural Cross-Lingual Named Entity Recognition: An Empirical Analysis
Xiaolei Huang, Jonathan May, Nanyun Peng
09 Sep 2019
Adversarial Learning with Contextual Embeddings for Zero-resource Cross-lingual Classification and NER
Phillip Keung, Y. Lu, Vikas Bhardwaj
31 Aug 2019
Small and Practical BERT Models for Sequence Labeling
Henry Tsai, Jason Riesa, Melvin Johnson, N. Arivazhagan, Xin Li, Amelia Archer
31 Aug 2019
Beto, Bentz, Becas: The Surprising Cross-Lingual Effectiveness of BERT
Shijie Wu, Mark Dredze
19 Apr 2019
To Tune or Not to Tune? Adapting Pretrained Representations to Diverse Tasks
Matthew E. Peters, Sebastian Ruder, Noah A. Smith
14 Mar 2019
FitNets: Hints for Thin Deep Nets
Adriana Romero, Nicolas Ballas, Samira Ebrahimi Kahou, Antoine Chassang, C. Gatta, Yoshua Bengio
19 Dec 2014