ResearchTrend.AI

Evidential Uncertainty Probes for Graph Neural Networks

11 March 2025
Linlin Yu
Kangshuo Li
Pritom Kumar Saha
Yifei Lou
Feng Chen
Topics: EDL, UQCV
Abstract

Accurate quantification of both aleatoric and epistemic uncertainty is essential when deploying Graph Neural Networks (GNNs) in high-stakes applications such as drug discovery and financial fraud detection, where reliable predictions are critical. Although Evidential Deep Learning (EDL) efficiently quantifies uncertainty using a Dirichlet distribution over predictive probabilities, existing EDL-based GNN (EGNN) models require modifications to the network architecture and retraining, and thus cannot take advantage of pre-trained models. We propose a plug-and-play framework for uncertainty quantification in GNNs that works with pre-trained models without retraining. Our Evidential Probing Network (EPN) uses a lightweight Multi-Layer Perceptron (MLP) head to extract evidence from learned representations, allowing efficient integration with various GNN architectures. We further introduce evidence-based regularization techniques, referred to as EPN-reg, to enhance the estimation of epistemic uncertainty, with theoretical justification. Extensive experiments demonstrate that the proposed EPN-reg achieves state-of-the-art performance in accurate and efficient uncertainty quantification, making it suitable for real-world deployment.
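The mechanism the abstract describes — a small MLP head that maps frozen GNN embeddings to non-negative class evidence, interpreted as Dirichlet parameters from which aleatoric and epistemic uncertainty are read off — can be sketched as follows. This is a hypothetical illustration using standard EDL conventions (softplus evidence, vacuity = K / sum(alpha) as epistemic uncertainty), not the authors' exact architecture or the EPN-reg regularizers; all class and function names are invented for the example.

```python
import numpy as np

def softplus(x):
    """Smooth non-negative activation used to produce evidence."""
    return np.log1p(np.exp(x))

class EvidentialProbe:
    """Hypothetical EPN-style head: a lightweight MLP mapping frozen
    GNN node embeddings to Dirichlet concentration parameters."""

    def __init__(self, d_in, d_hidden, n_classes, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.1, (d_in, d_hidden))
        self.b1 = np.zeros(d_hidden)
        self.W2 = rng.normal(0.0, 0.1, (d_hidden, n_classes))
        self.b2 = np.zeros(n_classes)

    def forward(self, h):
        """h: (n_nodes, d_in) embeddings from a pre-trained GNN.
        Returns alpha: (n_nodes, n_classes) Dirichlet parameters."""
        z = np.maximum(h @ self.W1 + self.b1, 0.0)   # ReLU hidden layer
        evidence = softplus(z @ self.W2 + self.b2)   # non-negative evidence
        return evidence + 1.0                        # alpha = evidence + 1

def uncertainties(alpha):
    """Standard EDL readouts from Dirichlet parameters alpha."""
    S = alpha.sum(axis=-1, keepdims=True)            # Dirichlet strength
    p = alpha / S                                    # expected class probabilities
    aleatoric = 1.0 - p.max(axis=-1)                 # data uncertainty of prediction
    epistemic = alpha.shape[-1] / S.squeeze(-1)      # vacuity: K / sum(alpha)
    return p, aleatoric, epistemic
```

Because the probe only consumes embeddings, it can be trained on top of any frozen GNN backbone, which is what makes the approach plug-and-play: nodes with little accumulated evidence (small sum of alpha) receive high epistemic uncertainty, while ambiguous class probabilities yield high aleatoric uncertainty.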

@article{yu2025_2503.08097,
  title={Evidential Uncertainty Probes for Graph Neural Networks},
  author={Linlin Yu and Kangshuo Li and Pritom Kumar Saha and Yifei Lou and Feng Chen},
  journal={arXiv preprint arXiv:2503.08097},
  year={2025}
}