
Subgraph Gaussian Embedding Contrast for Self-Supervised Graph Representation Learning

Main: 13 pages
Bibliography: 4 pages
Appendix: 1 page
Figures: 4
Tables: 4
Abstract

Graph Representation Learning (GRL) is a fundamental task in machine learning, aiming to encode high-dimensional graph-structured data into low-dimensional vectors. Self-Supervised Learning (SSL) methods are widely used in GRL because they avoid expensive human annotation. In this work, we propose a novel Subgraph Gaussian Embedding Contrast (SubGEC) method. Our approach introduces a subgraph Gaussian embedding module, which adaptively maps subgraphs to a structured Gaussian space, preserving the characteristics of the input subgraphs while generating subgraphs with a controlled distribution. We then employ optimal transport distances, specifically the Wasserstein and Gromov-Wasserstein distances, to effectively measure the similarity between subgraphs, enhancing the robustness of the contrastive learning process. Extensive experiments across multiple benchmarks demonstrate that SubGEC outperforms or achieves performance competitive with state-of-the-art approaches. Our findings provide insights into the design of SSL methods for GRL, emphasizing the importance of the distribution of the generated contrastive pairs.
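To make the two ideas named in the abstract concrete, below is a minimal sketch (not the authors' implementation, and not using their code or hyperparameters): (i) a head that maps pooled subgraph features to a diagonal Gaussian embedding (mean and log-variance with reparameterization, KL-regularized toward a standard normal to keep the embedding distribution controlled), and (ii) a closed-form 2-Wasserstein distance between diagonal Gaussians used as the similarity for contrastive pairs. The paper computes Wasserstein and Gromov-Wasserstein distances via optimal transport between subgraphs; the closed-form Gaussian distance here is only an illustrative stand-in, and all module names, shapes, and the KL weight are assumptions.

import torch
import torch.nn as nn


class SubgraphGaussianHead(nn.Module):
    """Maps pooled subgraph features to a diagonal Gaussian embedding (illustrative)."""

    def __init__(self, in_dim: int, latent_dim: int):
        super().__init__()
        self.mu = nn.Linear(in_dim, latent_dim)
        self.logvar = nn.Linear(in_dim, latent_dim)

    def forward(self, h: torch.Tensor):
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterization trick: sample z ~ N(mu, sigma^2).
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        # KL(N(mu, sigma^2) || N(0, I)) keeps the generated embeddings near a standard Gaussian.
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=-1)
        return z, mu, logvar, kl.mean()


def wasserstein2_diag_gaussians(mu1, logvar1, mu2, logvar2):
    """Squared 2-Wasserstein distance between two diagonal Gaussians (closed form)."""
    s1, s2 = torch.exp(0.5 * logvar1), torch.exp(0.5 * logvar2)
    return torch.sum((mu1 - mu2) ** 2 + (s1 - s2) ** 2, dim=-1)


if __name__ == "__main__":
    # Toy usage: two augmented "views" of 32 pooled subgraph feature vectors (hypothetical shapes).
    head = SubgraphGaussianHead(in_dim=16, latent_dim=8)
    x1, x2 = torch.randn(32, 16), torch.randn(32, 16)
    _, mu1, lv1, kl1 = head(x1)
    _, mu2, lv2, kl2 = head(x2)
    # Positive pairs (two views of the same subgraph) should be close in Wasserstein distance.
    pos = wasserstein2_diag_gaussians(mu1, lv1, mu2, lv2).mean()
    loss = pos + 0.1 * (kl1 + kl2)  # the 0.1 KL weight is an assumed value
    print(float(loss))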

@article{xie2025_2505.23529,
  title={Subgraph Gaussian Embedding Contrast for Self-Supervised Graph Representation Learning},
  author={Shifeng Xie and Aref Einizade and Jhony H. Giraldo},
  journal={arXiv preprint arXiv:2505.23529},
  year={2025}
}