Two-stage Textual Knowledge Distillation for End-to-End Spoken Language Understanding

25 October 2020
Seongbin Kim, Gyuwan Kim, Seongjin Shin, Sangmin Lee
Topic tags: VLM

Papers citing "Two-stage Textual Knowledge Distillation for End-to-End Spoken Language Understanding"

5 papers listed.
The Interpreter Understands Your Meaning: End-to-end Spoken Language Understanding Aided by Speech Translation
Mutian He, Philip N. Garner
16 May 2023

Transformers in Speech Processing: A Survey
S. Latif, Aun Zaidi, Heriberto Cuayáhuitl, Fahad Shamshad, Moazzam Shoukat, Junaid Qadir
21 March 2023

EPIK: Eliminating multi-model Pipelines with Knowledge-distillation
Bhavesh Laddagiri, Yash Raj, Anshuman Dash
27 November 2022

Distilling a Pretrained Language Model to a Multilingual ASR Model
Kwanghee Choi, Hyung-Min Park
Topic tags: VLM
25 June 2022

Tokenwise Contrastive Pretraining for Finer Speech-to-BERT Alignment in End-to-End Speech-to-Intent Systems
Vishal Sunder, Eric Fosler-Lussier, Samuel Thomas, H. Kuo, Brian Kingsbury
11 April 2022