HMamba: Hyperbolic Mamba for Sequential Recommendation

14 May 2025
Qianru Zhang
Honggang Wen
Wei Yuan
Crystal Chen
Menglin Yang
Siu-Ming Yiu
Hongzhi Yin
Abstract

Sequential recommendation systems have become a cornerstone of personalized services, adept at modeling the temporal evolution of user preferences by capturing dynamic interaction sequences. Existing approaches predominantly rely on traditional models, including RNNs and Transformers. Despite their success in local pattern recognition, Transformer-based methods suffer from quadratic computational complexity and a tendency toward superficial attention patterns, limiting their ability to infer enduring preference hierarchies in sequential recommendation data. Recent advances in Mamba-based sequential models introduce linear-time efficiency but remain constrained by Euclidean geometry, failing to leverage the intrinsic hyperbolic structure of recommendation data. To bridge this gap, we propose Hyperbolic Mamba, a novel architecture that unifies the efficiency of Mamba's selective state space mechanism with hyperbolic geometry's hierarchical representational power. Our framework introduces (1) a hyperbolic selective state space that maintains curvature-aware sequence modeling and (2) stabilized Riemannian operations to enable scalable training. Experiments across four benchmarks demonstrate that Hyperbolic Mamba achieves a 3-11% improvement while retaining Mamba's linear-time efficiency, enabling real-world deployment. This work establishes a new paradigm for efficient, hierarchy-aware sequential modeling.
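To make the "curvature-aware" and "stabilized Riemannian operations" ideas concrete, the sketch below implements the standard exponential and logarithmic maps at the origin of the Poincaré ball of curvature -c, with norm clipping as a common numerical-stability trick. This is an illustrative sketch of the underlying geometry only, not the authors' implementation; the function names, the NumPy setting, and the clipping thresholds are assumptions.

```python
import numpy as np

def expmap0(v, c=1.0, eps=1e-7):
    """Map a Euclidean (tangent) vector v into the Poincare ball of
    curvature -c via the exponential map at the origin."""
    sqrt_c = np.sqrt(c)
    norm = np.clip(np.linalg.norm(v, axis=-1, keepdims=True), eps, None)
    return np.tanh(sqrt_c * norm) * v / (sqrt_c * norm)

def logmap0(x, c=1.0, eps=1e-7):
    """Map a point x in the Poincare ball back to the tangent space at
    the origin; the arctanh argument is clipped below 1 for stability."""
    sqrt_c = np.sqrt(c)
    norm = np.clip(np.linalg.norm(x, axis=-1, keepdims=True), eps, None)
    return np.arctanh(np.clip(sqrt_c * norm, 0.0, 1.0 - 1e-7)) * x / (sqrt_c * norm)

# Round trip: log o exp should recover the tangent vector.
v = np.array([0.1, 0.2])
x = expmap0(v)          # point inside the unit ball
v_back = logmap0(x)     # approximately equal to v
```

A curvature-aware sequence layer in this spirit would apply its Euclidean state-space update in the tangent space (after `logmap0`) and project results back with `expmap0`, so that learned representations respect the ball's hierarchical geometry; the clipping is what keeps `arctanh` finite as points approach the boundary.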

@article{zhang2025_2505.09205,
  title={HMamba: Hyperbolic Mamba for Sequential Recommendation},
  author={Qianru Zhang and Honggang Wen and Wei Yuan and Crystal Chen and Menglin Yang and Siu-Ming Yiu and Hongzhi Yin},
  journal={arXiv preprint arXiv:2505.09205},
  year={2025}
}