ResearchTrend.AI

On the compression of shallow non-causal ASR models using knowledge distillation and tied-and-reduced decoder for low-latency on-device speech recognition

15 December 2023
Nagaraj Adiga
Jinhwan Park
Chintigari Shiva Kumar
Shatrughan Singh
Kyungmin Lee
Chanwoo Kim
Dhananjaya N. Gowda

Papers citing "On the compression of shallow non-causal ASR models using knowledge distillation and tied-and-reduced decoder for low-latency on-device speech recognition"

No citing papers found.