arXiv: 2303.03600
Adaptive Knowledge Distillation between Text and Speech Pre-trained Models
7 March 2023
Jinjie Ni, Yukun Ma, Wen Wang, Qian Chen, Dianwen Ng, Han Lei, Trung Hieu Nguyen, Chong Zhang, B. Ma, Min Zhang
Papers citing "Adaptive Knowledge Distillation between Text and Speech Pre-trained Models"
SpeechT5: Unified-Modal Encoder-Decoder Pre-Training for Spoken Language Processing
Junyi Ao, Rui Wang, Long Zhou, Chengyi Wang, Shuo Ren, ..., Yu Zhang, Zhihua Wei, Yao Qian, Jinyu Li, Furu Wei
14 Oct 2021