CodeSSM: Towards State Space Models for Code Understanding

Although transformers are widely used for code-specific tasks, they have significant limitations, notably the quadratic cost of self-attention in context length. In this paper, we investigate State Space Models (SSMs) as a potential alternative to transformers for code understanding tasks such as code retrieval, classification, and clone detection. Prior work has already demonstrated that SSMs are more compute-efficient than transformers; we show that they are also more sample-efficient and can effectively extrapolate to contexts longer than the pretraining context during fine-tuning. Through comprehensive experiments, we demonstrate that SSMs are a viable alternative to transformers for code understanding tasks while addressing several of the major limitations associated with transformers.
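The efficiency claims rest on the linear-time recurrence at the core of SSM architectures such as Mamba. The following minimal sketch (with arbitrary random parameters, not the paper's model) illustrates the discretized SSM update, whose cost grows linearly with sequence length rather than quadratically as in self-attention:

```python
# Minimal discretized state space model (SSM) recurrence, the building
# block behind architectures such as Mamba. Illustrative sketch only:
# parameters are random, not learned, and not the paper's implementation.
import numpy as np

def ssm_scan(A_bar, B_bar, C, u):
    """Run the linear SSM recurrence over a 1-D input sequence.

    x_k = A_bar @ x_{k-1} + B_bar * u_k   (hidden state update)
    y_k = C @ x_k                         (readout)

    One pass over the sequence: O(length) time, O(1) state memory.
    """
    x = np.zeros(A_bar.shape[0])
    ys = []
    for u_k in u:
        x = A_bar @ x + B_bar * u_k
        ys.append(C @ x)
    return np.array(ys)

# Toy usage: a 4-dimensional hidden state processing a length-1000 sequence.
rng = np.random.default_rng(0)
A_bar = 0.9 * np.eye(4)        # stable (contractive) state matrix
B_bar = rng.normal(size=4)
C = rng.normal(size=4)
u = rng.normal(size=1000)
y = ssm_scan(A_bar, B_bar, C, u)
print(y.shape)                 # (1000,)
```

Because the state x is a fixed-size summary of everything seen so far, the same recurrence can in principle be unrolled past the pretraining length, which is the property the paper's context-extrapolation experiments probe.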
@article{verma2025_2505.01475,
  title   = {CodeSSM: Towards State Space Models for Code Understanding},
  author  = {Shweta Verma and Abhinav Anand and Mira Mezini},
  journal = {arXiv preprint arXiv:2505.01475},
  year    = {2025}
}