RobustDistiller: Compressing Universal Speech Representations for Enhanced Environment Robustness

18 February 2023
Heitor R. Guimarães, Arthur Pimentel, Anderson R. Avila, Mehdi Rezagholizadeh, Boxing Chen, Tiago H. Falk

Papers citing "RobustDistiller: Compressing Universal Speech Representations for Enhanced Environment Robustness"

2 / 2 papers shown
Compressing Transformer-based self-supervised models for speech processing
Tzu-Quan Lin, Tsung-Huan Yang, Chun-Yao Chang, Kuang-Ming Chen, Tzu-hsun Feng, Hung-yi Lee, Hao Tang
17 Nov 2022
SNIPER Training: Single-Shot Sparse Training for Text-to-Speech
Perry Lam, Huayun Zhang, Nancy F. Chen, Berrak Sisman, Dorien Herremans
14 Nov 2022