arXiv:2406.09444
GenDistiller: Distilling Pre-trained Language Models based on an Autoregressive Generative Model
12 June 2024
Yingying Gao
Shilei Zhang
Chao Deng
Junlan Feng
Papers citing "GenDistiller: Distilling Pre-trained Language Models based on an Autoregressive Generative Model" (1 of 1 papers shown):
SpeechT5: Unified-Modal Encoder-Decoder Pre-Training for Spoken Language Processing
Junyi Ao
Rui Wang
Long Zhou
Chengyi Wang
Shuo Ren
...
Yu Zhang
Zhihua Wei
Yao Qian
Jinyu Li
Furu Wei
14 Oct 2021