Two-stage Textual Knowledge Distillation for End-to-End Spoken Language Understanding
arXiv:2010.13105, 25 October 2020
Seongbin Kim, Gyuwan Kim, Seongjin Shin, Sangmin Lee
Papers citing "Two-stage Textual Knowledge Distillation for End-to-End Spoken Language Understanding" (5 papers)
The Interpreter Understands Your Meaning: End-to-end Spoken Language Understanding Aided by Speech Translation
Mutian He, Philip N. Garner
16 May 2023
Transformers in Speech Processing: A Survey
S. Latif, Aun Zaidi, Heriberto Cuayáhuitl, Fahad Shamshad, Moazzam Shoukat, Junaid Qadir
21 Mar 2023
EPIK: Eliminating multi-model Pipelines with Knowledge-distillation
Bhavesh Laddagiri, Yash Raj, Anshuman Dash
27 Nov 2022
Distilling a Pretrained Language Model to a Multilingual ASR Model
Kwanghee Choi, Hyung-Min Park
25 Jun 2022
Tokenwise Contrastive Pretraining for Finer Speech-to-BERT Alignment in End-to-End Speech-to-Intent Systems
Vishal Sunder, Eric Fosler-Lussier, Samuel Thomas, H. Kuo, Brian Kingsbury
11 Apr 2022