In recent years, graph neural networks (GNNs) have gained significant attention for node classification on graph-structured data. However, traditional GNNs focus primarily on adjacency relationships between nodes, often overlooking the rich role-based characteristics that are crucial for learning more expressive node representations. Existing methods for capturing role-based features are largely unsupervised and fall short of optimal performance on downstream tasks. To address these limitations, we propose a novel hypergraph neural network with a state space model (HGMN) that effectively integrates role-aware representations into both the GNN and the state space model. HGMN uses hypergraph construction techniques to model higher-order relationships and combines role-based and adjacency-based representations through a learnable Mamba transformer mechanism. By leveraging two distinct hypergraph construction methods, one based on node degree and the other on neighborhood levels, it strengthens the connections among nodes with similar roles, enhancing the model's representational power. The inclusion of hypergraph convolution layers further enables the model to capture complex dependencies within hypergraph structures. To mitigate the over-smoothing problem inherent in deep GNNs, we incorporate a residual network, ensuring improved stability and better feature propagation across layers. Extensive experiments on one newly introduced dataset and four benchmark datasets demonstrate the superiority of HGMN: the model achieves significant performance improvements on node classification compared to state-of-the-art GNN methods. These results highlight HGMN's ability to provide enriched node representations by embedding role-based features alongside adjacency information, making it a versatile and powerful tool for a variety of graph-based learning applications.
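To make the degree-based construction and the hypergraph convolution step concrete, the following is a minimal sketch, not the authors' implementation: nodes sharing the same degree are grouped into one hyperedge, and a single layer applies the standard hypergraph convolution rule X' = sigma(D_v^{-1/2} H W D_e^{-1} H^T D_v^{-1/2} X Theta) with identity hyperedge weights W, followed by a residual mix. The function names and the residual coefficient `alpha` are illustrative assumptions; the neighborhood-level construction and the Mamba fusion are omitted.

```python
# Minimal sketch (assumed, not the paper's code): degree-based hypergraph
# construction plus one hypergraph convolution layer with a residual mix.
import numpy as np

def degree_hypergraph(adj: np.ndarray) -> np.ndarray:
    """Group nodes with equal degree into one hyperedge.

    Returns the incidence matrix H (num_nodes x num_hyperedges),
    where H[v, e] = 1 iff node v belongs to hyperedge e.
    """
    degrees = adj.sum(axis=1).astype(int)
    unique_degs = np.unique(degrees)
    H = np.zeros((adj.shape[0], len(unique_degs)))
    for e, d in enumerate(unique_degs):
        H[degrees == d, e] = 1.0
    return H

def hypergraph_conv(X, H, Theta, alpha=0.5):
    """One hypergraph convolution layer with a residual connection.

    Implements X' = ReLU(Dv^-1/2 H De^-1 H^T Dv^-1/2 X Theta),
    with hyperedge weights taken as the identity, then mixes the
    input back in with coefficient alpha to counter over-smoothing.
    """
    Dv = np.diag(1.0 / np.sqrt(H.sum(axis=1)))   # node-degree normalization
    De = np.diag(1.0 / H.sum(axis=0))            # hyperedge-degree normalization
    prop = Dv @ H @ De @ H.T @ Dv                # normalized propagation matrix
    out = np.maximum(prop @ X @ Theta, 0.0)      # ReLU activation
    return alpha * X + (1.0 - alpha) * out       # residual mixing

# Toy usage: a 4-node path graph with random 8-dimensional features.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
X = np.random.default_rng(0).standard_normal((4, 8))
Theta = np.random.default_rng(1).standard_normal((8, 8)) * 0.1
H = degree_hypergraph(adj)       # degree-1 nodes vs. degree-2 nodes
X_out = hypergraph_conv(X, H, Theta)
print(X_out.shape)               # (4, 8)
```

In this toy graph the two endpoint nodes (degree 1) land in one hyperedge and the two interior nodes (degree 2) in another, so the convolution shares information among nodes with similar structural roles even when they are not adjacent.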