ResearchTrend.AI

arXiv:2312.02916
MIND: Multi-Task Incremental Network Distillation

5 December 2023
Jacopo Bonato, Francesco Pelosin, Luigi Sabetta, Alessandro Nicolosi
CLL

Papers citing "MIND: Multi-Task Incremental Network Distillation"

4 / 4 papers shown
  1. Componential Prompt-Knowledge Alignment for Domain Incremental Learning
     Kunlun Xu, Xu Zou, Gang Hua, Jiahuan Zhou
     CLL · 07 May 2025

  2. Pathological Prior-Guided Multiple Instance Learning For Mitigating Catastrophic Forgetting in Breast Cancer Whole Slide Image Classification
     Weixi Zheng, Aoling Huang, Jingping Yuan, Haoyu Zhao, Zhou Zhao, Yongchao Xu, Thierry Géraud
     CLL · 08 Mar 2025

  3. Dynamic Integration of Task-Specific Adapters for Class Incremental Learning
     Jiashuo Li, Shaokun Wang, Bo Qian, Yuhang He, Xing Wei, Qiang Wang, Yihong Gong
     CLL · 23 Sep 2024

  4. Learning Fast, Learning Slow: A General Continual Learning Method based on Complementary Learning System
     Elahe Arani, F. Sarfraz, Bahram Zonooz
     CLL · 29 Jan 2022