
Towards Efficient Few-shot Graph Neural Architecture Search via Partitioning Gradient Contribution

Main: 8 pages · Appendix: 1 page · Bibliography: 3 pages · 8 figures · 9 tables
Abstract

To address the weight coupling problem, certain studies introduced few-shot Neural Architecture Search (NAS) methods, which partition the supernet into multiple sub-supernets. However, these methods are often computationally inefficient and tend to yield suboptimal partitioning schemes. To address this problem more effectively, we analyze weight coupling from a novel perspective: it primarily stems from distinct modules in succeeding layers imposing conflicting gradient directions on the modules of the preceding layer. Based on this perspective, we propose the Gradient Contribution (GC) method, which efficiently computes the cosine similarity of gradient directions among modules by decomposing the Vector-Jacobian Product during supernet backpropagation. Modules with conflicting gradient directions are then allocated to distinct sub-supernets, while similar ones are grouped together. To assess the advantages of GC and to overcome a limitation of existing Graph Neural Architecture Search methods, which can search only a single type of Graph Neural Network (GNN), namely Message Passing Neural Networks (MPNNs) or Graph Transformers (GTs), we propose the Unified Graph Neural Architecture Search (UGAS) framework, which explores optimal combinations of MPNNs and GTs. Experimental results demonstrate that GC achieves state-of-the-art (SOTA) performance in both supernet partitioning quality and time efficiency. In addition, the architectures searched by UGAS+GC outperform both manually designed GNNs and those obtained by existing NAS methods. Finally, ablation studies further confirm the effectiveness of all proposed methods.
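The core mechanism described above can be illustrated with a minimal sketch, assuming a toy two-layer PyTorch supernet. This is not the authors' implementation: the candidate modules, the toy data, and the grouping rule (a zero cosine-similarity threshold) are all illustrative assumptions. The sketch shows the abstract's central idea, namely isolating the gradient each succeeding-layer module contributes to the shared preceding layer, comparing those contributions by cosine similarity, and splitting conflicting modules into separate sub-supernets.

# Minimal sketch of the Gradient Contribution (GC) idea, assuming a toy
# two-layer supernet in PyTorch. Module choices, data, and the grouping
# threshold are illustrative assumptions, not the paper's implementation.
import torch
import torch.nn as nn

torch.manual_seed(0)
dim = 16

# Preceding layer shared by all candidate modules of the succeeding layer.
shared = nn.Linear(dim, dim)

# Candidate modules of the succeeding layer (stand-ins for MPNN/GT ops).
candidates = nn.ModuleList([nn.Linear(dim, dim) for _ in range(4)])

x = torch.randn(8, dim)
y = torch.randn(8, dim)
loss_fn = nn.MSELoss()

# Gradient each candidate contributes to the shared layer's weights.
# Conceptually, this isolates one term of the Vector-Jacobian Product
# that backpropagation would otherwise sum over all succeeding modules.
contribs = []
for op in candidates:
    loss = loss_fn(op(shared(x)), y)
    grad = torch.autograd.grad(loss, shared.weight)[0]
    contribs.append(grad.flatten())

# Pairwise cosine similarity of the gradient directions.
G = torch.stack(contribs)                      # shape: (num_candidates, dim*dim)
sim = torch.nn.functional.cosine_similarity(
    G.unsqueeze(1), G.unsqueeze(0), dim=-1
)

# Toy grouping rule (an assumption): place each module in the first
# sub-supernet with no conflicting member (similarity >= 0); otherwise
# open a new sub-supernet.
groups = []
for i in range(len(candidates)):
    for g in groups:
        if all(sim[i, j].item() >= 0 for j in g):
            g.append(i)
            break
    else:
        groups.append([i])

print("cosine similarity matrix:\n", sim)
print("sub-supernet partition:", groups)

In this sketch, computing each candidate's contribution separately costs one backward pass per module; the paper's decomposition of the Vector-Jacobian Product is precisely what makes this comparison cheap enough to do within a single supernet backpropagation.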

@article{song2025_2506.01231,
  title={Towards Efficient Few-shot Graph Neural Architecture Search via Partitioning Gradient Contribution},
  author={Wenhao Song and Xuan Wu and Bo Yang and You Zhou and Yubin Xiao and Yanchun Liang and Hongwei Ge and Heow Pueh Lee and Chunguo Wu},
  journal={arXiv preprint arXiv:2506.01231},
  year={2025}
}