Degree-Conscious Spiking Graph for Cross-Domain Adaptation

9 October 2024
Yingxu Wang
Mengzhu Wang
Siwei Liu
Shangsong Liang
Nan Yin
James Kwok
Abstract

Spiking Graph Networks (SGNs) have demonstrated significant potential in graph classification by emulating brain-inspired neural dynamics to achieve energy-efficient computation. However, existing SGNs are generally constrained to in-distribution scenarios and struggle with distribution shifts. In this paper, we first formulate the domain adaptation problem for SGNs and introduce a novel framework, Degree-Conscious Spiking Graph for Cross-Domain Adaptation (DeSGraDA). DeSGraDA enhances generalization across domains with three key components. First, we introduce a degree-conscious spiking representation module that adapts spike thresholds based on node degrees, enabling more expressive and structure-aware signal encoding. Second, we perform temporal distribution alignment by adversarially matching membrane potentials between domains, ensuring effective performance under domain shift while preserving energy efficiency. Third, we extract consistent predictions across two spaces to create reliable pseudo-labels, effectively leveraging unlabeled data to enhance graph classification performance. Furthermore, we establish the first generalization bound for spiking graph domain adaptation (SGDA), providing theoretical insight into its adaptation performance. Extensive experiments on benchmark datasets validate that DeSGraDA consistently outperforms state-of-the-art methods in both classification accuracy and energy efficiency.
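To make the degree-conscious thresholding concrete, the fragment below is a minimal PyTorch sketch, not the authors' implementation: one leaky integrate-and-fire (LIF) update in which each node's firing threshold grows with its degree. The logarithmic scaling, the names degree_conscious_lif_step, base_threshold, alpha, and decay, and all constants are illustrative assumptions.

# Hypothetical sketch of a degree-conscious LIF step; all names and the
# log-degree scaling are assumptions, not the paper's actual formulation.
import torch

def degree_conscious_lif_step(membrane, inputs, degrees,
                              base_threshold=1.0, alpha=0.5, decay=0.9):
    """One LIF update where each node's spike threshold scales with its
    degree, so high-degree hubs need stronger evidence to fire."""
    # Per-node threshold: assumed log scaling of the node degree.
    threshold = base_threshold * (1.0 + alpha * torch.log1p(degrees.float()))
    # Leaky integration of the incoming (message-passed) current.
    membrane = decay * membrane + inputs
    # Binary spikes wherever the membrane potential crosses the threshold.
    spikes = (membrane >= threshold).float()
    # Soft reset: subtract the threshold where a spike occurred.
    membrane = membrane - spikes * threshold
    return membrane, spikes

# Toy usage: four nodes with increasing degrees and identical input current.
deg = torch.tensor([1, 2, 8, 32])
v = torch.zeros(4)
v, s = degree_conscious_lif_step(v, torch.full((4,), 1.5), deg)

Under this assumed scaling, high-degree hubs, which aggregate more incoming current, require proportionally stronger evidence to spike; that is one plausible reading of the abstract's "structure-aware signal encoding."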

@article{wang2025_2410.06883,
  title={Degree-Conscious Spiking Graph for Cross-Domain Adaptation},
  author={Yingxu Wang and Mengzhu Wang and Siwei Liu and Houcheng Su and Nan Yin and James Kwok},
  journal={arXiv preprint arXiv:2410.06883},
  year={2025}
}