
Class Incremental Learning for Algorithm Selection

Main: 3 pages, 2 figures, 3 tables; Bibliography: 1 page
Abstract

Algorithm selection is commonly used to predict the best solver from a portfolio on a per-instance basis. In many real-world scenarios, instances arrive in a stream: new instances become available over time, and the number of class labels can also grow as new data distributions arrive downstream. As a result, the classification model must be periodically updated to reflect additional solvers without catastrophically forgetting past data. In machine learning (ML), this is referred to as Class Incremental Learning (CIL). While commonly addressed in ML settings, its relevance to algorithm selection in optimisation has not been previously studied. Using a bin-packing dataset, we benchmark 8 continual learning methods with respect to their ability to withstand catastrophic forgetting. We find that rehearsal-based methods significantly outperform the other CIL methods. Although there is evidence of forgetting, the loss in accuracy is small, at around 7%. Hence, these methods appear to be a viable approach to continual learning in streaming optimisation scenarios.

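To illustrate the kind of rehearsal-based approach the abstract refers to, the sketch below shows a generic experience-replay strategy for class incremental learning: a small buffer of past (instance-features, solver-label) pairs is replayed alongside new data when additional solver classes arrive. This is a minimal illustration under assumed names (ReplayBuffer, train_on_new_classes, the buffer capacity, and the PyTorch training loop are all illustrative), not the authors' implementation or the specific methods benchmarked in the paper.

# Minimal sketch of rehearsal (experience replay) for class incremental
# learning. All names, sizes, and the training loop are illustrative
# assumptions, not the paper's code.
import random
import torch
import torch.nn.functional as F

class ReplayBuffer:
    """Keeps a small uniform sample of past (features, solver-label) pairs."""
    def __init__(self, capacity=200):
        self.capacity = capacity
        self.data = []   # list of (x, y) tensor pairs
        self.seen = 0    # total examples offered, for reservoir sampling

    def add(self, x, y):
        for xi, yi in zip(x, y):
            self.seen += 1
            if len(self.data) < self.capacity:
                self.data.append((xi, yi))
            else:
                # Reservoir sampling keeps a uniform sample of the stream.
                j = random.randrange(self.seen)
                if j < self.capacity:
                    self.data[j] = (xi, yi)

    def sample(self, batch_size):
        batch = random.sample(self.data, min(batch_size, len(self.data)))
        xs, ys = zip(*batch)
        return torch.stack(xs), torch.stack(ys)

def train_on_new_classes(model, optimizer, loader, buffer, epochs=1):
    """Train on data for newly arrived solver classes, replaying stored
    examples from earlier classes to limit catastrophic forgetting.
    Assumes the model's output head already covers all classes seen so far."""
    for _ in range(epochs):
        for x_new, y_new in loader:
            x, y = x_new, y_new
            if buffer.data:
                x_old, y_old = buffer.sample(len(x_new))
                x = torch.cat([x_new, x_old])
                y = torch.cat([y_new, y_old])
            loss = F.cross_entropy(model(x), y)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
            buffer.add(x_new, y_new)

In a streaming algorithm-selection setting, train_on_new_classes would be called each time instances labelled with a new solver arrive, with the classifier head expanded to accommodate the new class before training.
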
@article{nemeth2025_2506.01545,
  title={Class Incremental Learning for Algorithm Selection},
  author={Mate Botond Nemeth and Emma Hart and Kevin Sim and Quentin Renau},
  journal={arXiv preprint arXiv:2506.01545},
  year={2025}
}