Self-Supervised Graph Contrastive Pretraining for Device-level Integrated Circuits

13 February 2025
Sungyoung Lee
Ziyi Wang
Seunggeun Kim
Taekyun Lee
David Z. Pan
Topics: SSL, GNN
Abstract

Self-supervised graph representation learning has driven significant advancements in domains such as social network analysis, molecular design, and electronics design automation (EDA). However, prior works in EDA have mainly focused on the representation of gate-level digital circuits, failing to capture analog and mixed-signal circuits. To address this gap, we introduce DICE: Device-level Integrated Circuits Encoder, the first self-supervised pretrained graph neural network (GNN) model for any circuit expressed at the device level. DICE is a message-passing neural network (MPNN) trained through graph contrastive learning, and its pretraining process is simulation-free, incorporating two novel data augmentation techniques. Experimental results demonstrate that DICE achieves substantial performance gains across three downstream tasks, underscoring its effectiveness for both analog and digital circuits.
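
For readers who want a concrete picture of the training recipe the abstract describes, the sketch below illustrates graph contrastive pretraining of a message-passing encoder in plain PyTorch. It is not the authors' implementation: MPNNLayer, the random edge-drop augmentation, and the NT-Xent loss are generic stand-ins (the paper's two novel augmentation techniques are not reproduced here), and all names are hypothetical.

import torch
import torch.nn.functional as F

class MPNNLayer(torch.nn.Module):
    """One round of mean-aggregation message passing."""
    def __init__(self, dim):
        super().__init__()
        self.lin = torch.nn.Linear(2 * dim, dim)

    def forward(self, x, edge_index):
        src, dst = edge_index                      # (2, E) directed edge list
        msg = torch.zeros_like(x).index_add_(0, dst, x[src])
        deg = torch.zeros(x.size(0), 1).index_add_(
            0, dst, torch.ones(src.size(0), 1)).clamp(min=1)
        return F.relu(self.lin(torch.cat([x, msg / deg], dim=-1)))

class Encoder(torch.nn.Module):
    """Stacked MPNN layers followed by mean pooling to a graph embedding."""
    def __init__(self, dim, num_layers=3):
        super().__init__()
        self.layers = torch.nn.ModuleList(MPNNLayer(dim) for _ in range(num_layers))

    def forward(self, x, edge_index):
        for layer in self.layers:
            x = layer(x, edge_index)
        return x.mean(dim=0)                       # one vector per graph

def drop_edges(edge_index, p=0.2):
    """Toy augmentation: randomly drop a fraction of edges."""
    keep = torch.rand(edge_index.size(1)) > p
    return edge_index[:, keep]

def nt_xent(z1, z2, tau=0.5):
    """NT-Xent loss: two views of the same graph are positives, all else negatives."""
    z = F.normalize(torch.cat([z1, z2]), dim=-1)
    sim = z @ z.t() / tau
    sim.fill_diagonal_(float('-inf'))              # exclude self-similarity
    n = z1.size(0)
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)

# Toy usage: four random "circuit graphs", two augmented views each.
torch.manual_seed(0)
graphs = [(torch.randn(8, 16), torch.randint(0, 8, (2, 20))) for _ in range(4)]
enc = Encoder(dim=16)
z1 = torch.stack([enc(x, drop_edges(e)) for x, e in graphs])
z2 = torch.stack([enc(x, drop_edges(e)) for x, e in graphs])
loss = nt_xent(z1, z2)
loss.backward()

In this setup the encoder is trained to pull two augmented views of the same circuit graph together and push different circuits apart; no simulator output enters the loss, which mirrors the abstract's "simulation-free" claim.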

View on arXiv: https://arxiv.org/abs/2502.08949
@article{lee2025_2502.08949,
  title={Self-Supervised Graph Contrastive Pretraining for Device-level Integrated Circuits},
  author={Sungyoung Lee and Ziyi Wang and Seunggeun Kim and Taekyun Lee and David Z. Pan},
  journal={arXiv preprint arXiv:2502.08949},
  year={2025}
}