Selection Mechanisms for Sequence Modeling using Linear State Space Models

23 May 2025
Umberto Casti
Sandro Zampieri
Fabio Pasqualetti
Topics: Mamba
Abstract

Recent advancements in language modeling tasks have been driven by architectures such as Transformers and, more recently, by Selective State Space Models (SSMs). In this paper, we introduce an alternative selection mechanism inspired by control-theoretic methodologies. Specifically, we propose a novel residual generator for selection, drawing an analogy to fault detection strategies in Linear Time-Invariant (LTI) systems. Unlike Mamba, which utilizes Linear Time-Varying (LTV) systems, our approach combines multiple LTI systems, preserving their beneficial properties during training while achieving comparable selectivity. To evaluate the effectiveness of the proposed architecture, we test its performance on synthetic tasks. While these tasks are not inherently critical, they serve as benchmarks for testing the selectivity properties of different core architectures. This work highlights the potential of integrating theoretical insights with experimental advancements, offering a complementary perspective on deep learning innovations at the intersection of control theory and machine learning.
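
To make the selection idea concrete, below is a minimal, hypothetical NumPy sketch (not the authors' code) of how a fault-detection-style residual generator could gate a bank of LTI state space models: every trained component remains time-invariant, and selectivity comes from input-dependent mixing of the bank outputs via the residual signal. All names, shapes, and the softmax mixing rule are illustrative assumptions, not details taken from the paper.

import numpy as np

def lti_scan(A, B, C, u):
    """Run the discrete-time LTI system x_{t+1} = A x_t + B u_t, y_t = C x_t
    over an input sequence u of shape (T, d_in); returns outputs of shape (T, d_out)."""
    x = np.zeros(A.shape[0])
    ys = []
    for u_t in u:
        x = A @ x + B @ u_t
        ys.append(C @ x)
    return np.stack(ys)

def residual_selective_ssm(u, bank, residual_filter):
    """Hypothetical residual-based selection (illustrative, not the paper's exact design).
    `bank` is a list of k LTI triples (A, B, C); `residual_filter` is another LTI
    system whose k-dimensional output r_t plays the role of a fault-detection
    residual: near zero for 'nominal' inputs, large when the input carries
    information worth storing. The residual is turned into softmax weights
    that mix the (time-invariant) bank outputs."""
    r = lti_scan(*residual_filter, u)                          # (T, k) residual signal
    r = r - r.max(axis=-1, keepdims=True)                      # numerically stable softmax
    w = np.exp(r) / np.exp(r).sum(axis=-1, keepdims=True)      # weights over the bank index
    ys = np.stack([lti_scan(A, B, C, u) for A, B, C in bank])  # (k, T, d_out)
    return np.einsum('tk,ktd->td', w, ys)                      # input-dependent mixture

# Toy usage: three stable LTI systems gated by one residual generator.
rng = np.random.default_rng(0)
d, n, k, T = 4, 8, 3, 32
bank = [(0.9 * np.eye(n), rng.normal(size=(n, d)), rng.normal(size=(d, n)))
        for _ in range(k)]
residual_filter = (0.5 * np.eye(n), rng.normal(size=(n, d)), rng.normal(size=(k, n)))
u = rng.normal(size=(T, d))
print(residual_selective_ssm(u, bank, residual_filter).shape)  # (32, 4)

Because each (A, B, C) is fixed across time, the bank can be trained with standard LTI machinery, while the residual path supplies the input dependence that an LTV model like Mamba obtains by varying its system matrices.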

@article{casti2025_2505.17932,
  title={Selection Mechanisms for Sequence Modeling using Linear State Space Models},
  author={Umberto Casti and Sandro Zampieri and Fabio Pasqualetti},
  journal={arXiv preprint arXiv:2505.17932},
  year={2025}
}