Enhancing knowledge retention for continual learning with domain-specific adapters and features gating

11 April 2025
Mohamed Abbas Hedjazi
Oussama Hadjerci
Adel Hafiane
Abstract

Continual learning empowers models to learn from a continuous stream of data while preserving previously acquired knowledge, effectively addressing the challenge of catastrophic forgetting. In this study, we propose a new approach that integrates adapters within the self-attention mechanisms of Vision Transformers to enhance knowledge retention when sequentially adding datasets from different domains. Unlike previous methods that continue learning with only one dataset, our approach introduces domain-specific output heads and feature gating, allowing the model to maintain high accuracy on previously learned tasks while incorporating only the essential information from multiple domains. The proposed method is compared to prominent parameter-efficient fine-tuning methods in the current state of the art. The results provide evidence that our method effectively alleviates the limitations of previous works. Furthermore, we conduct a comparative analysis using three datasets, CIFAR-100, Flowers102, and DTD, each representing a distinct domain, to investigate the impact of task order on model performance. Our findings underscore the critical role of dataset sequencing in shaping learning outcomes, demonstrating that strategic ordering can significantly improve the model's ability to adapt to evolving data distributions over time while preserving the integrity of previously learned knowledge.
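The abstract describes the architecture only at a high level. As a concrete illustration, the PyTorch sketch below shows one plausible way to wire a bottleneck adapter into a Vision Transformer self-attention block, apply a per-domain feature gate, and attach domain-specific output heads. Class names, gate placement, and hyperparameters are assumptions made for illustration, not the authors' implementation.

# Minimal sketch (not the paper's released code): a ViT encoder block with a
# bottleneck adapter inside the self-attention sublayer, a per-domain feature
# gate, and domain-specific classification heads.
import torch
import torch.nn as nn


class Adapter(nn.Module):
    """Bottleneck adapter: down-project, non-linearity, up-project, residual."""
    def __init__(self, dim: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.up = nn.Linear(bottleneck, dim)
        self.act = nn.GELU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(self.act(self.down(x)))


class GatedAttentionBlock(nn.Module):
    """Self-attention sublayer with a domain-specific adapter and feature gate."""
    def __init__(self, dim: int, num_heads: int, num_domains: int):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        # One adapter and one learnable gate vector per domain.
        self.adapters = nn.ModuleList(Adapter(dim) for _ in range(num_domains))
        self.gates = nn.Parameter(torch.zeros(num_domains, dim))

    def forward(self, x: torch.Tensor, domain: int) -> torch.Tensor:
        h = self.norm(x)
        h, _ = self.attn(h, h, h)
        h = self.adapters[domain](h)
        # Sigmoid gate selects which feature channels this domain contributes.
        gate = torch.sigmoid(self.gates[domain])
        return x + gate * h


class ContinualViTHeads(nn.Module):
    """Domain-specific classification heads over the backbone features."""
    def __init__(self, dim: int, classes_per_domain: list[int]):
        super().__init__()
        self.heads = nn.ModuleList(nn.Linear(dim, c) for c in classes_per_domain)

    def forward(self, feats: torch.Tensor, domain: int) -> torch.Tensor:
        return self.heads[domain](feats)


# Example: forward a batch of 16 patch-token sequences for domain 0.
block = GatedAttentionBlock(dim=768, num_heads=12, num_domains=3)
tokens = torch.randn(16, 197, 768)
out = block(tokens, domain=0)

In a setup like this, training on a new domain would update only that domain's adapter, gate, and head while the shared backbone stays frozen, which is the usual way such parameter-efficient additions avoid overwriting knowledge from earlier tasks.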

View on arXiv
@article{hedjazi2025_2504.08613,
  title={Enhancing knowledge retention for continual learning with domain-specific adapters and features gating},
  author={Mohamed Abbas Hedjazi and Oussama Hadjerci and Adel Hafiane},
  journal={arXiv preprint arXiv:2504.08613},
  year={2025}
}