Oracle Teacher: Leveraging Target Information for Better Knowledge Distillation of CTC Models
arXiv:2111.03664, 5 November 2021
J. Yoon, H. Kim, Hyeon Seung Lee, Sunghwan Ahn, N. Kim
Papers citing "Oracle Teacher: Leveraging Target Information for Better Knowledge Distillation of CTC Models" (5 of 5 shown):

Distilling Knowledge via Knowledge Review
Pengguang Chen, Shu-Lin Liu, Hengshuang Zhao, Jiaya Jia
19 Apr 2021

Leveraging Acoustic and Linguistic Embeddings from Pretrained speech and language Models for Intent Classification
Bidisha Sharma, Maulik C. Madhavi, Haizhou Li
15 Feb 2021

Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching
Mingi Ji, Byeongho Heo, Sungrae Park
5 Feb 2021

Efficiently Fusing Pretrained Acoustic and Linguistic Encoders for Low-resource Speech Recognition
Cheng Yi, Shiyu Zhou, Bo Xu
17 Jan 2021

NeMo: a toolkit for building AI applications using Neural Modules
Oleksii Kuchaiev, Jason Chun Lok Li, Huyen Nguyen, Oleksii Hrinchuk, Ryan Leary, ..., Jack Cook, P. Castonguay, Mariya Popova, Jocelyn Huang, Jonathan M. Cohen
14 Sep 2019