
DPFormer: Dynamic Prompt Transformer for Continual Learning

Main: 9 pages, 2 figures; Bibliography: 2 pages; Appendix: 1 page
Abstract

In continual learning, solving the catastrophic forgetting problem can push models into the stability-plasticity dilemma. Moreover, inter-task confusion arises from the lack of knowledge exchange between different tasks. To address these problems, we propose a novel dynamic prompt transformer (DPFormer) with prompt schemes. The prompt schemes help the DPFormer memorize knowledge learned from previous classes and tasks and continue learning new knowledge from new classes and tasks within a single network structure with a nearly fixed number of model parameters. They also provide discrepant information to represent different tasks, which alleviates the inter-task confusion problem. Based on the prompt schemes, a unified classification module trained with the binary cross-entropy loss, the knowledge distillation loss, and an auxiliary loss is proposed so that the whole model is trainable in an end-to-end manner. Compared with state-of-the-art methods, our method achieves the best performance on the CIFAR-100, ImageNet100, and ImageNet1K datasets under different class-incremental settings in continual learning. The source code will be available on our GitHub after acceptance.
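The abstract names three training objectives (binary cross-entropy, knowledge distillation, and an auxiliary loss) but does not give their exact forms. Below is a minimal, hedged sketch of how such a combined objective is commonly assembled in class-incremental learning; the function name, loss weights, temperature, and the choice of cross-entropy for the auxiliary head are all assumptions for illustration, not the authors' definitive formulation.

```python
import torch
import torch.nn.functional as F

def combined_loss(logits, targets, old_logits=None, aux_logits=None, aux_targets=None,
                  kd_temperature=2.0, kd_weight=1.0, aux_weight=1.0):
    """Hypothetical combination of the three losses named in the abstract:
    binary cross-entropy for classification, knowledge distillation against the
    previous-task model, and an auxiliary loss (here, cross-entropy on a
    separate auxiliary head). Weights and temperature are placeholder values."""
    # Binary cross-entropy over one-hot targets, a common choice in
    # class-incremental methods.
    one_hot = F.one_hot(targets, num_classes=logits.size(1)).float()
    loss = F.binary_cross_entropy_with_logits(logits, one_hot)

    # Knowledge distillation: align current outputs on old classes with the
    # frozen previous-task model's outputs.
    if old_logits is not None:
        n_old = old_logits.size(1)
        kd = F.kl_div(
            F.log_softmax(logits[:, :n_old] / kd_temperature, dim=1),
            F.softmax(old_logits / kd_temperature, dim=1),
            reduction="batchmean",
        ) * (kd_temperature ** 2)
        loss = loss + kd_weight * kd

    # Auxiliary loss, e.g. a head that separates new classes from old ones.
    if aux_logits is not None and aux_targets is not None:
        loss = loss + aux_weight * F.cross_entropy(aux_logits, aux_targets)

    return loss
```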

@article{huang2025_2506.07414,
  title={DPFormer: Dynamic Prompt Transformer for Continual Learning},
  author={Sheng-Kai Huang and Jiun-Feng Chang and Chun-Rong Huang},
  journal={arXiv preprint arXiv:2506.07414},
  year={2025}
}