Tuning Algorithmic and Architectural Hyperparameters in Graph-Based Semi-Supervised Learning with Provable Guarantees

18 February 2025
Ally Yalei Du
Eric Huang
Dravyansh Sharma
arXiv · PDF · HTML
Abstract

Graph-based semi-supervised learning is a powerful paradigm in machine learning for modeling and exploiting the underlying graph structure that captures the relationship between labeled and unlabeled data. A large number of classical as well as modern deep-learning-based algorithms have been proposed for this problem, often with tunable hyperparameters. We initiate a formal study of tuning algorithm hyperparameters from parameterized algorithm families for this problem. We obtain novel O(log n) pseudo-dimension upper bounds for hyperparameter selection in three classical label-propagation-based algorithm families, where n is the number of nodes, implying bounds on the amount of data needed to learn provably good parameters. We further provide matching Ω(log n) pseudo-dimension lower bounds, thus asymptotically characterizing the learning-theoretic complexity of the parameter tuning problem. We extend our study to selecting architectural hyperparameters in modern graph neural networks. We bound the Rademacher complexity for tuning the self-loop weighting in recently proposed Simplified Graph Convolution (SGC) networks. We further propose a tunable architecture that interpolates between graph convolutional networks (GCN) and graph attention networks (GAT) in every layer, and provide Rademacher complexity bounds for tuning the interpolation coefficient.
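To make the "self-loop weighting" hyperparameter concrete, the following is a minimal NumPy sketch of an SGC-style feature propagation with a tunable self-loop weight alpha. This is an illustrative reconstruction, not the paper's exact formulation: the function name `sgc_features`, the parameter name `alpha`, and the toy graph are all assumptions for the example; SGC itself is the K-step propagation S^K X with symmetric normalization and no nonlinearities.

```python
import numpy as np

def sgc_features(A, X, alpha=1.0, K=2):
    """K-step simplified-graph-convolution feature propagation.

    alpha weights the added self-loops: alpha = 1 recovers the standard
    renormalization trick (A + I); other values re-weight each node's own
    features relative to its neighbors', which is the kind of architectural
    hyperparameter one might tune.
    """
    n = A.shape[0]
    A_tilde = A + alpha * np.eye(n)           # adjacency with weighted self-loops
    d = A_tilde.sum(axis=1)                   # degrees of the augmented graph
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))    # D^{-1/2}
    S = D_inv_sqrt @ A_tilde @ D_inv_sqrt     # symmetrically normalized propagation matrix
    return np.linalg.matrix_power(S, K) @ X   # S^K X: propagate, no nonlinearity

# Toy example: a path graph on 3 nodes with 2-dimensional node features.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X = np.eye(3, 2)
Z = sgc_features(A, X, alpha=0.5, K=2)
```

The propagated features `Z` would then feed a simple classifier (e.g. logistic regression on the labeled nodes); tuning amounts to selecting alpha from data.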

View on arXiv
@article{du2025_2502.12937,
  title={Tuning Algorithmic and Architectural Hyperparameters in Graph-Based Semi-Supervised Learning with Provable Guarantees},
  author={Ally Yalei Du and Eric Huang and Dravyansh Sharma},
  journal={arXiv preprint arXiv:2502.12937},
  year={2025}
}