
Unlocking the Power of Rehearsal in Continual Learning: A Theoretical Perspective

Main: 8 pages, Appendix: 39 pages, Bibliography: 3 pages; 4 figures, 14 tables
Abstract

Rehearsal-based methods have shown superior performance in addressing catastrophic forgetting in continual learning (CL) by storing and training on a subset of past data alongside the new data of the current task. While such a concurrent rehearsal strategy is widely used, it remains unclear whether this approach is always optimal. Inspired by human learning, where sequentially revisiting tasks helps mitigate forgetting, we explore whether sequential rehearsal can offer greater benefits for CL than standard concurrent rehearsal. To address this question, we conduct a theoretical analysis of rehearsal-based CL in overparameterized linear models, comparing two strategies: 1) concurrent rehearsal, where past and new data are trained on together; and 2) sequential rehearsal, where new data is trained on first, followed by revisiting past data sequentially. By explicitly characterizing forgetting and generalization error, we show that sequential rehearsal performs better when tasks are less similar. These insights further motivate a novel hybrid rehearsal method, which trains similar tasks concurrently and revisits dissimilar tasks sequentially. We characterize its forgetting and generalization performance, and our experiments with deep neural networks confirm that the hybrid approach outperforms standard concurrent rehearsal. This work provides the first comprehensive theoretical analysis of rehearsal-based CL.
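To make the three rehearsal schedules concrete, the following is a minimal sketch (not the authors' implementation) on toy overparameterized linear-regression tasks: concurrent rehearsal mixes stored past data with the new data in a single training pass, sequential rehearsal trains on the new data first and then revisits each stored task in turn, and the hybrid schedule splits stored tasks by similarity. The similarity scores and the threshold `sim_threshold` are illustrative assumptions, not quantities defined in the paper.

import numpy as np

rng = np.random.default_rng(0)
d, n = 50, 10  # overparameterized regime: dimension > samples per task

def make_task(w_true):
    # One linear-regression task: n noiseless samples in d dimensions.
    X = rng.normal(size=(n, d))
    return X, X @ w_true

def sgd(w, X, y, lr=0.01, steps=500):
    # Plain gradient descent on squared loss for this data block.
    for _ in range(steps):
        w = w - lr * (X.T @ (X @ w - y)) / len(y)
    return w

def concurrent_rehearsal(w, new_task, memory):
    # Train on the new data and all stored past data together.
    X = np.vstack([new_task[0]] + [m[0] for m in memory])
    y = np.concatenate([new_task[1]] + [m[1] for m in memory])
    return sgd(w, X, y)

def sequential_rehearsal(w, new_task, memory):
    # Train on the new data first, then revisit each past task in turn.
    w = sgd(w, *new_task)
    for X_old, y_old in memory:
        w = sgd(w, X_old, y_old)
    return w

def hybrid_rehearsal(w, new_task, memory, sims, sim_threshold=0.5):
    # Similar past tasks are mixed with the new data (concurrent);
    # dissimilar past tasks are revisited afterwards (sequential).
    similar = [m for m, s in zip(memory, sims) if s >= sim_threshold]
    dissimilar = [m for m, s in zip(memory, sims) if s < sim_threshold]
    w = concurrent_rehearsal(w, new_task, similar)
    for X_old, y_old in dissimilar:
        w = sgd(w, X_old, y_old)
    return w

# Toy usage: two stored tasks, one similar and one dissimilar to the new task.
w_new = rng.normal(size=d)
memory = [make_task(w_new + 0.1 * rng.normal(size=d)),  # similar task
          make_task(rng.normal(size=d))]                # dissimilar task
w = hybrid_rehearsal(np.zeros(d), make_task(w_new), memory, sims=[0.9, 0.1])

In this sketch, how task similarity is measured and where the threshold sits are design choices; the paper's theory characterizes when the sequential schedule for dissimilar tasks yields lower forgetting and generalization error.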

@article{deng2025_2506.00205,
  title={Unlocking the Power of Rehearsal in Continual Learning: A Theoretical Perspective},
  author={Junze Deng and Qinhang Wu and Peizhong Ju and Sen Lin and Yingbin Liang and Ness Shroff},
  journal={arXiv preprint arXiv:2506.00205},
  year={2025}
}