Sequence-Level Knowledge Distillation for Class-Incremental End-to-End Spoken Language Understanding

23 May 2023 · arXiv:2305.13899
Umberto Cappellazzo, Muqiao Yang, Daniele Falavigna, A. Brutti
Tags: CLL, KELM

Papers citing "Sequence-Level Knowledge Distillation for Class-Incremental End-to-End Spoken Language Understanding"

4 / 4 papers shown
Finding Task-specific Subnetworks in Multi-task Spoken Language Understanding Model
Hayato Futami, Siddhant Arora, Yosuke Kashiwagi, E. Tsunoo, Shinji Watanabe
18 Jun 2024
Unraveling Key Factors of Knowledge Distillation
Jingxuan Wei, Linzhuang Sun, Xu Tan, Bihui Yu, Ruifeng Guo
14 Dec 2023
Continual Contrastive Spoken Language Understanding
Umberto Cappellazzo, Enrico Fini, Muqiao Yang, Daniele Falavigna, A. Brutti, Bhiksha Raj
Tags: CLL
04 Oct 2023
Learning Representations for New Sound Classes With Continual Self-Supervised Learning
Zhepei Wang, Cem Subakan, Xilin Jiang, Junkai Wu, Efthymios Tzinis, Mirco Ravanelli, Paris Smaragdis
Tags: CLL, SSL
15 May 2022