GenDistiller: Distilling Pre-trained Language Models based on an Autoregressive Generative Model

12 June 2024
Yingying Gao
Shilei Zhang
Chao Deng
Junlan Feng

Papers citing "GenDistiller: Distilling Pre-trained Language Models based on an Autoregressive Generative Model"

SpeechT5: Unified-Modal Encoder-Decoder Pre-Training for Spoken Language Processing
Junyi Ao
Rui Wang
Long Zhou
Chengyi Wang
Shuo Ren
...
Yu Zhang
Zhihua Wei
Yao Qian
Jinyu Li
Furu Wei
14 Oct 2021