
Learning without Isolation: Pathway Protection for Continual Learning

Comments: 8 pages main text, 12-page appendix, 3-page bibliography; 6 figures, 20 tables
Abstract

Deep networks are prone to catastrophic forgetting during sequential task learning, i.e., they lose knowledge of old tasks upon learning new ones. Continual learning (CL) has emerged to address this problem, and existing methods focus mostly on regularizing or protecting the parameters associated with previous tasks. However, parameter protection is often impractical: the number of parameters needed to store old-task knowledge grows linearly with the number of tasks, and without such growth it is hard to preserve the parameters that encode that knowledge. In this work, we bring a complementary view from neuroscience and physics to CL: across the whole network, pathways matter more than individual parameters for retaining the knowledge acquired from old tasks. Following this view, we propose a novel CL framework, learning without isolation (LwI), in which model fusion is formulated as graph matching and the pathways occupied by old tasks are protected without being isolated. Thanks to the sparsity of activation channels in a deep network, LwI can adaptively allocate available pathways to a new task, realizing pathway protection and addressing catastrophic forgetting in a parameter-efficient manner. Experiments on popular benchmark datasets demonstrate the superiority of the proposed LwI.
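To make the "model fusion as graph matching" idea concrete, the sketch below pairs the output channels of two corresponding layers (old-task model and new-task model) by solving a linear assignment problem over a channel-similarity cost, then fuses the aligned weights. This is a minimal illustration under stated assumptions, not the paper's exact procedure: the cosine-similarity cost, the averaging fusion rule, and the names `match_channels` and `fuse_layers` are all hypothetical, and LwI's adaptive pathway allocation is elided.

import numpy as np
from scipy.optimize import linear_sum_assignment

def match_channels(w_old: np.ndarray, w_new: np.ndarray) -> np.ndarray:
    """Match output channels of two (out, in) weight matrices by solving
    a linear assignment problem on a similarity-based cost matrix."""
    # Cost: negative cosine similarity between channel weight vectors.
    a = w_old / (np.linalg.norm(w_old, axis=1, keepdims=True) + 1e-8)
    b = w_new / (np.linalg.norm(w_new, axis=1, keepdims=True) + 1e-8)
    cost = -a @ b.T                        # shape: (out_old, out_new)
    _, col_ind = linear_sum_assignment(cost)
    return col_ind                          # permutation of new channels

def fuse_layers(w_old: np.ndarray, w_new: np.ndarray, alpha: float = 0.5) -> np.ndarray:
    """Permute the new layer's channels to align with the old layer,
    then average the aligned weights (a deliberately simple fusion rule)."""
    perm = match_channels(w_old, w_new)
    return alpha * w_old + (1 - alpha) * w_new[perm]

# Toy usage: fuse two randomly initialized 4x3 layers.
rng = np.random.default_rng(0)
w_old, w_new = rng.normal(size=(4, 3)), rng.normal(size=(4, 3))
print(fuse_layers(w_old, w_new).shape)      # (4, 3)

Aligning channels before averaging is what distinguishes matching-based fusion from naive weight averaging: without the permutation step, functionally equivalent channels that sit at different indices in the two models would be averaged against unrelated ones.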

@article{chen2025_2505.18568,
  title={Learning without Isolation: Pathway Protection for Continual Learning},
  author={Zhikang Chen and Abudukelimu Wuerkaixi and Sen Cui and Haoxuan Li and Ding Li and Jingfeng Zhang and Bo Han and Gang Niu and Houfang Liu and Yi Yang and Sifan Yang and Changshui Zhang and Tianling Ren},
  journal={arXiv preprint arXiv:2505.18568},
  year={2025}
}