Continual Learning with Columnar Spiking Neural Networks

Main text: 10 pages; bibliography: 2 pages; 4 figures; 10 tables
Abstract
This study investigates columnar-organized spiking neural networks (SNNs) in the context of continual learning and catastrophic forgetting. Using CoLaNET (Columnar Layered Network), we show that microcolumns adapt most efficiently to new tasks when they share no structure with previously learned ones. We demonstrate how CoLaNET hyperparameters govern the trade-off between retaining old knowledge (stability) and acquiring new information (plasticity). Our optimal configuration learns ten sequential MNIST tasks effectively, maintaining 92% accuracy on each. It exhibits low forgetting, with only 4% performance degradation on the first task after training on the nine subsequent tasks.
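To make the reported figures concrete, the sketch below (ours, not from the paper) shows the bookkeeping commonly used in continual-learning evaluations of this kind: after each task is learned, the model is evaluated on all tasks seen so far, and "forgetting" on the first task is the drop between its accuracy right after it was learned and its accuracy after the final task. The function names and the toy accuracy matrix are illustrative assumptions, not part of CoLaNET or the authors' code.

    # Minimal sketch of a standard continual-learning accuracy/forgetting report.
    # acc[i][j] is the test accuracy on task j after finishing training on task i.
    from typing import List

    def average_final_accuracy(acc: List[List[float]]) -> float:
        """Mean accuracy over all tasks, measured after the last task is learned."""
        final_row = acc[-1]
        return sum(final_row) / len(final_row)

    def forgetting_on_task(acc: List[List[float]], task: int = 0) -> float:
        """Accuracy drop on `task` between learning it and finishing the sequence."""
        after_learning = acc[task][task]
        after_all_tasks = acc[-1][task]
        return after_learning - after_all_tasks

    if __name__ == "__main__":
        # Toy 3-task accuracy matrix with illustrative numbers only.
        acc = [
            [0.96, 0.00, 0.00],
            [0.94, 0.95, 0.00],
            [0.92, 0.93, 0.96],
        ]
        print(f"average final accuracy: {average_final_accuracy(acc):.2f}")
        print(f"forgetting on task 0:   {forgetting_on_task(acc, 0):.2f}")

Under this convention, the abstract's claim corresponds to forgetting of roughly 0.04 on the first of ten MNIST tasks, with per-task accuracies around 0.92 after the full sequence.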
@article{larionov2025_2506.17169,
  title   = {Continual Learning with Columnar Spiking Neural Networks},
  author  = {Denis Larionov and Nikolay Bazenkov and Mikhail Kiselev},
  journal = {arXiv preprint arXiv:2506.17169},
  year    = {2025}
}