Residual Reweighted Conformal Prediction for Graph Neural Networks

9 June 2025
Zheng Zhang
Jie Bao
Zhixin Zhou
Nicolo Colombo
Lixin Cheng
Rui Luo
Main: 8 pages · Bibliography: 4 pages · Appendix: 6 pages · 8 figures · 12 tables
Abstract

Graph Neural Networks (GNNs) excel at modeling relational data but face significant challenges in high-stakes domains due to unquantified uncertainty. Conformal prediction (CP) offers statistical coverage guarantees, but existing methods often produce overly conservative prediction intervals that fail to account for graph heteroscedasticity and structural biases. While residual reweighting CP variants address some of these limitations, they neglect graph topology, cluster-specific uncertainties, and risk data leakage by reusing training sets. To address these issues, we propose Residual Reweighted GNN (RR-GNN), a framework designed to generate minimal prediction sets with provable marginal coverage guarantees. RR-GNN introduces three major innovations to enhance prediction performance. First, it employs Graph-Structured Mondrian CP to partition nodes or edges into communities based on topological features, ensuring cluster-conditional coverage that reflects heterogeneity. Second, it uses Residual-Adaptive Nonconformity Scores by training a secondary GNN on a held-out calibration set to estimate task-specific residuals, dynamically adjusting prediction intervals according to node or edge uncertainty. Third, it adopts a Cross-Training Protocol, which alternates the optimization of the primary GNN and the residual predictor to prevent information leakage while maintaining graph dependencies. We validate RR-GNN on 15 real-world graphs across diverse tasks, including node classification, regression, and edge weight prediction. Compared to state-of-the-art CP baselines, RR-GNN produces more efficient (tighter) prediction sets with no loss of coverage.
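To make the first two ingredients concrete, here is a minimal sketch of cluster-conditional (Mondrian) split conformal prediction with residual-reweighted nonconformity scores. The function names and the NumPy-array interface are our own illustration, not the paper's implementation: `mu_*` stand in for the primary GNN's point predictions, `sigma_*` for the secondary residual model's uncertainty estimates, and `groups_*` for the community labels obtained from the graph partition.

```python
import numpy as np

def rr_mondrian_intervals(y_cal, mu_cal, sigma_cal, groups_cal,
                          mu_test, sigma_test, groups_test, alpha=0.1):
    """Residual-reweighted split CP with per-cluster (Mondrian) quantiles.

    y_cal:     held-out calibration labels
    mu_*:      point predictions (primary model)
    sigma_*:   positive residual-magnitude estimates (secondary model)
    groups_*:  cluster/community label per node or edge
    Returns (lower, upper) interval bounds for the test points.
    """
    # Normalized nonconformity scores on the calibration set:
    # large estimated residuals shrink the score, so intervals
    # widen adaptively where the residual model expects more error.
    scores = np.abs(y_cal - mu_cal) / sigma_cal

    lo = np.empty_like(mu_test, dtype=float)
    hi = np.empty_like(mu_test, dtype=float)
    for g in np.unique(groups_test):
        s = scores[groups_cal == g]
        n = len(s)
        # Finite-sample conformal quantile computed within the cluster,
        # giving cluster-conditional (Mondrian) coverage.
        level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
        q = np.quantile(s, level, method="higher")
        m = groups_test == g
        lo[m] = mu_test[m] - q * sigma_test[m]
        hi[m] = mu_test[m] + q * sigma_test[m]
    return lo, hi
```

Note the interval half-width is `q * sigma_test`, so two test points in the same cluster share the quantile `q` but get different widths when their estimated residuals differ; the cross-training of the primary and residual models, which the sketch omits, is what keeps `sigma_*` from leaking training-set information.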

@article{zhang2025_2506.07854,
  title={Residual Reweighted Conformal Prediction for Graph Neural Networks},
  author={Zheng Zhang and Jie Bao and Zhixin Zhou and Nicolo Colombo and Lixin Cheng and Rui Luo},
  journal={arXiv preprint arXiv:2506.07854},
  year={2025}
}