Adapter-Enhanced Semantic Prompting for Continual Learning

15 December 2024
Baocai Yin
Ji Zhao
Huajie Jiang
Ningning Hou
Yongli Hu
Amin Beheshti
Ming-Hsuan Yang
Yuankai Qi
    CLL
    VLM
Abstract

Continual learning (CL) enables models to adapt to evolving data streams. A major challenge of CL is catastrophic forgetting, where new knowledge overwrites previously acquired knowledge. Traditional methods usually retain past data for replay or add extra branches to the model to learn new knowledge, both of which incur high memory costs. In this paper, we propose a novel lightweight CL framework, Adapter-Enhanced Semantic Prompting (AESP), which integrates prompt tuning and adapter techniques. Specifically, we design semantic-guided prompts to enhance the generalization ability of visual features and utilize adapters to efficiently fuse the semantic information, aiming to learn more adaptive features for the continual learning task. Furthermore, to choose the right task prompt for feature adaptation, we develop a novel matching mechanism for prompt selection. Extensive experiments on three CL datasets demonstrate that our approach achieves favorable performance across multiple metrics, showing its potential for advancing CL.
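
The sketch below illustrates, in PyTorch, the general pattern the abstract describes: a pool of task prompts with learned keys, a similarity-based matching mechanism that selects the prompt for an input, and a bottleneck adapter that fuses the prompted features. All module names, dimensions, the cosine-similarity matching, and the fusion/pooling choices are illustrative assumptions for exposition, not the paper's actual implementation.

# Minimal sketch of a prompt pool + adapter fusion + key-based prompt matching.
# Everything here (names, sizes, fusion strategy) is an assumption, not AESP itself.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Adapter(nn.Module):
    """Bottleneck adapter intended to fuse semantic information into features."""
    def __init__(self, dim: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.up = nn.Linear(bottleneck, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Residual bottleneck: x + up(relu(down(x)))
        return x + self.up(F.relu(self.down(x)))


class PromptPool(nn.Module):
    """Per-task prompts with learned keys; a query feature picks the closest prompt."""
    def __init__(self, num_tasks: int, prompt_len: int, dim: int):
        super().__init__()
        self.prompts = nn.Parameter(torch.randn(num_tasks, prompt_len, dim) * 0.02)
        self.keys = nn.Parameter(torch.randn(num_tasks, dim) * 0.02)

    def forward(self, query: torch.Tensor):
        # Cosine-similarity matching between the query feature and prompt keys.
        sim = F.normalize(query, dim=-1) @ F.normalize(self.keys, dim=-1).t()  # (B, T)
        idx = sim.argmax(dim=-1)                                               # (B,)
        return self.prompts[idx], idx                                          # (B, L, D)


class AESPSketch(nn.Module):
    """Toy forward pass: select a prompt, prepend it to patch tokens, adapt features."""
    def __init__(self, dim: int = 768, num_tasks: int = 5, prompt_len: int = 8,
                 num_classes: int = 100):
        super().__init__()
        self.pool = PromptPool(num_tasks, prompt_len, dim)
        self.adapter = Adapter(dim)
        self.head = nn.Linear(dim, num_classes)

    def forward(self, patch_tokens: torch.Tensor, query_feat: torch.Tensor):
        prompts, task_idx = self.pool(query_feat)           # choose a task prompt
        tokens = torch.cat([prompts, patch_tokens], dim=1)  # prepend prompt tokens
        feats = self.adapter(tokens)                        # adapter-based fusion
        logits = self.head(feats.mean(dim=1))               # pooled classification
        return logits, task_idx


# Usage with random tensors standing in for a frozen backbone's outputs.
model = AESPSketch()
patches = torch.randn(4, 196, 768)   # (batch, patches, dim)
query = torch.randn(4, 768)          # e.g. a frozen [CLS] feature used as the query
logits, picked = model(patches, query)
print(logits.shape, picked)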

@article{yin2025_2412.11074,
  title={Adapter-Enhanced Semantic Prompting for Continual Learning},
  author={Baocai Yin and Ji Zhao and Huajie Jiang and Ningning Hou and Yongli Hu and Amin Beheshti and Ming-Hsuan Yang and Yuankai Qi},
  journal={arXiv preprint arXiv:2412.11074},
  year={2025}
}