ResearchTrend.AI

MoLE: Mixture of Language Experts for Multi-Lingual Automatic Speech Recognition

27 February 2023
Yoohwan Kwon
Soo-Whan Chung
    MoE

Papers citing "MoLE: Mixture of Language Experts for Multi-Lingual Automatic Speech Recognition"

5 / 5 papers shown
Efficiently Democratizing Medical LLMs for 50 Languages via a Mixture of Language Family Experts
Guorui Zheng, Xidong Wang, Juhao Liang, Nuo Chen, Yuping Zheng, Benyou Wang
MoE
14 Oct 2024
LAE-ST-MoE: Boosted Language-Aware Encoder Using Speech Translation Auxiliary Task for E2E Code-switching ASR
Guodong Ma, Wenxuan Wang, Yuke Li, Yuting Yang, Binbin Du, Haoran Fu
28 Sep 2023
Multilingual Speech Recognition using Knowledge Transfer across Learning Processes
Rimita Lahiri, K. Kumatani, Eric Sun, Yao Qian
15 Oct 2021
A Configurable Multilingual Model is All You Need to Recognize All Languages
Long Zhou, Jinyu Li, Eric Sun, Shujie Liu
13 Jul 2021
Signal Transformer: Complex-valued Attention and Meta-Learning for Signal Recognition
Yihong Dong, Ying Peng, Muqiao Yang, Songtao Lu, Qingjiang Shi
05 Jun 2021