EntryPrune: Neural Network Feature Selection using First Impressions

Abstract

There is an ongoing effort to develop feature selection algorithms that improve interpretability, reduce computational resources, and minimize overfitting in predictive models. Neural networks stand out as architectures on which to build feature selection methods, and recently, neuron pruning and regrowth have emerged from the sparse neural network literature as promising new tools. We introduce EntryPrune, a novel supervised feature selection algorithm using a dense neural network with a dynamic sparse input layer. It employs entry-based pruning, a novel approach that compares neurons based on the relative change they induce after entering the network. Extensive experiments on 13 different datasets show that our approach generally outperforms the current state-of-the-art methods and, in particular, improves the average accuracy on low-dimensional datasets. Furthermore, we show that EntryPruning surpasses traditional techniques such as magnitude pruning within the EntryPrune framework, and that EntryPrune achieves lower runtime than competing approaches. Our code is available at this https URL.

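To make the idea of entry-based pruning concrete, the following is a minimal NumPy sketch of one possible prune-and-regrow step over a dynamic sparse input layer. It assumes each input neuron is scored by the relative change of its input-layer weights since the step it entered the network, and that the lowest-scoring active features are replaced by randomly regrown inactive ones; the function and parameter names (entry_scores, prune_and_regrow, prune_fraction) are hypothetical illustrations, not the authors' released code.

import numpy as np

def entry_scores(current_weights, entry_weights, eps=1e-12):
    # current_weights, entry_weights: (n_features, n_hidden) input-layer weights
    # now and at the time each feature (row) last entered the active set.
    change = np.linalg.norm(current_weights - entry_weights, axis=1)
    baseline = np.linalg.norm(entry_weights, axis=1) + eps
    # Assumption: features whose weights changed more (relative to their
    # value at entry) are treated as more useful and are kept.
    return change / baseline

def prune_and_regrow(active, scores, prune_fraction=0.3, rng=None):
    # Drop the lowest-scoring active features and regrow the same number
    # of currently inactive features at random (dynamic sparse input layer).
    if rng is None:
        rng = np.random.default_rng(0)
    active_idx = np.flatnonzero(active)
    k = int(prune_fraction * active_idx.size)
    pruned = active_idx[np.argsort(scores[active_idx])[:k]]
    active[pruned] = False
    inactive_idx = np.flatnonzero(~active)
    regrown = rng.choice(inactive_idx, size=min(k, inactive_idx.size), replace=False)
    active[regrown] = True
    return active
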
@article{zimmer2025_2410.02344,
  title={EntryPrune: Neural Network Feature Selection using First Impressions},
  author={Felix Zimmer and Patrik Okanovic and Torsten Hoefler},
  journal={arXiv preprint arXiv:2410.02344},
  year={2025}
}