Self-Knowledge Distillation in Natural Language Processing
Sangchul Hahn, Heeyoul Choi
arXiv:1908.01851, 2 August 2019
Papers citing "Self-Knowledge Distillation in Natural Language Processing" (11 of 61 shown)
Transformer-based ASR Incorporating Time-reduction Layer and Fine-tuning with Self-Knowledge Distillation
Md. Akmal Haidar, Chao Xing, Mehdi Rezagholizadeh
17 Mar 2021

Large-Scale Generative Data-Free Distillation
Liangchen Luo, Mark Sandler, Zi Lin, A. Zhmoginov, Andrew G. Howard
10 Dec 2020

Towards Data Distillation for End-to-end Spoken Conversational Question Answering
Chenyu You, Nuo Chen, Fenglin Liu, Dongchao Yang, Yuexian Zou
18 Oct 2020

Neighbourhood Distillation: On the benefits of non end-to-end distillation
Laetitia Shao, Max Moroz, Elad Eban, Yair Movshovitz-Attias
02 Oct 2020

Weight Distillation: Transferring the Knowledge in Neural Network Parameters
Ye Lin, Yanyang Li, Ziyang Wang, Bei Li, Quan Du, Tong Xiao, Jingbo Zhu
19 Sep 2020

Noisy Self-Knowledge Distillation for Text Summarization
Yang Liu, S. Shen, Mirella Lapata
15 Sep 2020

Tackling the Unannotated: Scene Graph Generation with Bias-Reduced Models
Tong Wang, Selen Pehlivan, Jorma T. Laaksonen
18 Aug 2020

Self-Knowledge Distillation with Progressive Refinement of Targets
Kyungyul Kim, Byeongmoon Ji, Doyoung Yoon, Sangheum Hwang
22 Jun 2020

Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
09 Jun 2020

Knowledge Distillation for Multilingual Unsupervised Neural Machine Translation
Haipeng Sun, Rui Wang, Kehai Chen, Masao Utiyama, Eiichiro Sumita, Tiejun Zhao
21 Apr 2020

A Strong Baseline for Learning Cross-Lingual Word Embeddings from Sentence Alignments
Omer Levy, Anders Søgaard, Yoav Goldberg
18 Aug 2016