On Pruning State-Space LLMs

26 February 2025
Tamer Ghattas
Michael Hassid
Roy Schwartz
Abstract

Recent work proposed state-space models (SSMs) as an efficient alternative to transformer-based LLMs. Can these models be pruned to further reduce their computation costs? We adapt several pruning methods to the SSM structure and apply them to four SSM-based LLMs across multiple tasks. We find that such models are quite robust to some pruning methods (e.g., WANDA), while other methods lead to rapid performance degradation.
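The abstract names WANDA as one of the pruning methods the authors adapt. As background only, the sketch below illustrates the standard WANDA criterion on a generic linear layer: each weight is scored by its magnitude times the L2 norm of its input activation over a small calibration set, and the lowest-scoring weights in each output row are zeroed. This is a minimal illustration of the original criterion, not the paper's SSM-specific adaptation; the function and tensor names are hypothetical.

import torch

def wanda_prune_linear(weight: torch.Tensor,
                       calib_inputs: torch.Tensor,
                       sparsity: float = 0.5) -> torch.Tensor:
    """WANDA-style unstructured pruning of one linear layer (illustrative sketch).

    weight:       (out_features, in_features) weight matrix.
    calib_inputs: (num_tokens, in_features) activations from a calibration set.
    sparsity:     fraction of weights to zero out per output row.
    """
    # Per-input-channel L2 norm of the calibration activations.
    act_norm = calib_inputs.norm(p=2, dim=0)          # (in_features,)

    # WANDA importance score: |W_ij| * ||X_j||_2.
    scores = weight.abs() * act_norm.unsqueeze(0)     # (out_features, in_features)

    # Zero the lowest-scoring weights within each output row.
    k = int(weight.shape[1] * sparsity)
    if k > 0:
        prune_idx = scores.topk(k, dim=1, largest=False).indices
        mask = torch.ones_like(weight, dtype=torch.bool)
        mask.scatter_(1, prune_idx, False)
        weight = weight * mask
    return weight

# Toy usage: prune a random 8x16 layer to 50% sparsity.
if __name__ == "__main__":
    W = torch.randn(8, 16)
    X = torch.randn(128, 16)   # hypothetical calibration activations
    W_pruned = wanda_prune_linear(W, X, sparsity=0.5)
    print((W_pruned == 0).float().mean())   # ~0.5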

@article{ghattas2025_2502.18886,
  title={On Pruning State-Space LLMs},
  author={Tamer Ghattas and Michael Hassid and Roy Schwartz},
  journal={arXiv preprint arXiv:2502.18886},
  year={2025}
}