Cited By: Reinforced Multi-Teacher Selection for Knowledge Distillation
arXiv: 2012.06048 · 11 December 2020
Fei Yuan, Linjun Shou, J. Pei, Wutao Lin, Ming Gong, Yan Fu, Daxin Jiang
Papers citing "Reinforced Multi-Teacher Selection for Knowledge Distillation" (7 of 7 papers shown):

1. Heuristic-Free Multi-Teacher Learning
   Huy Thong Nguyen, En-Hung Chu, Lenord Melvix, Jazon Jiao, Chunglin Wen, Benjamin Louie
   19 Nov 2024 · 0 citations

2. MIDAS: Multi-level Intent, Domain, And Slot Knowledge Distillation for Multi-turn NLU
   Yan Li, So-Eon Kim, Seong-Bae Park, S. Han
   15 Aug 2024 · 1 citation

3. ALBERT: A Lite BERT for Self-supervised Learning of Language Representations
   Zhenzhong Lan, Mingda Chen, Sebastian Goodman, Kevin Gimpel, Piyush Sharma, Radu Soricut
   26 Sep 2019 · 6,420 citations · Tags: SSL, AIMat

4. TinyBERT: Distilling BERT for Natural Language Understanding
   Xiaoqi Jiao, Yichun Yin, Lifeng Shang, Xin Jiang, Xiao Chen, Linlin Li, F. Wang, Qun Liu
   23 Sep 2019 · 1,847 citations · Tags: VLM

5. A Broad-Coverage Challenge Corpus for Sentence Understanding through Inference
   Adina Williams, Nikita Nangia, Samuel R. Bowman
   18 Apr 2017 · 4,444 citations

6. SQuAD: 100,000+ Questions for Machine Comprehension of Text
   Pranav Rajpurkar, Jian Zhang, Konstantin Lopyrev, Percy Liang
   16 Jun 2016 · 8,067 citations · Tags: RALM

7. Learning both Weights and Connections for Efficient Neural Networks
   Song Han, Jeff Pool, J. Tran, W. Dally
   08 Jun 2015 · 6,628 citations · Tags: CVBM