arXiv: 2009.14822
Pea-KD: Parameter-efficient and Accurate Knowledge Distillation on BERT
Ikhyun Cho, U. Kang
30 September 2020
Papers citing "Pea-KD: Parameter-efficient and Accurate Knowledge Distillation on BERT" (4 / 4 papers shown)
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter
Victor Sanh, Lysandre Debut, Julien Chaumond, Thomas Wolf
02 Oct 2019
TinyBERT: Distilling BERT for Natural Language Understanding
Xiaoqi Jiao, Yichun Yin, Lifeng Shang, Xin Jiang, Xiao Chen, Linlin Li, F. Wang, Qun Liu
23 Sep 2019
SQuAD: 100,000+ Questions for Machine Comprehension of Text
Pranav Rajpurkar, Jian Zhang, Konstantin Lopyrev, Percy Liang
16 Jun 2016
Learning both Weights and Connections for Efficient Neural Networks
Song Han, Jeff Pool, J. Tran, W. Dally
08 Jun 2015