ResearchTrend.AI

Oracle Teacher: Leveraging Target Information for Better Knowledge Distillation of CTC Models

arXiv:2111.03664 · 5 November 2021

J. Yoon, H. Kim, Hyeon Seung Lee, Sunghwan Ahn, N. Kim

Papers citing "Oracle Teacher: Leveraging Target Information for Better Knowledge Distillation of CTC Models"

5 papers shown.

Distilling Knowledge via Knowledge Review. Pengguang Chen, Shu Liu, Hengshuang Zhao, Jiaya Jia. 19 Apr 2021.

Leveraging Acoustic and Linguistic Embeddings from Pretrained speech and language Models for Intent Classification. Bidisha Sharma, Maulik C. Madhavi, Haizhou Li. 15 Feb 2021.

Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching. Mingi Ji, Byeongho Heo, Sungrae Park. 05 Feb 2021.

Efficiently Fusing Pretrained Acoustic and Linguistic Encoders for Low-resource Speech Recognition. Cheng Yi, Shiyu Zhou, Bo Xu. 17 Jan 2021.

NeMo: a toolkit for building AI applications using Neural Modules. Oleksii Kuchaiev, Jason Chun Lok Li, Huyen Nguyen, Oleksii Hrinchuk, Ryan Leary, ..., Jack Cook, P. Castonguay, Mariya Popova, Jocelyn Huang, Jonathan M. Cohen. 14 Sep 2019.