Pieceformer: Similarity-Driven Knowledge Transfer via Scalable Graph Transformer in VLSI

Accurate graph similarity is critical for knowledge transfer in VLSI design, enabling the reuse of prior solutions to reduce engineering effort and turnaround time. We propose Pieceformer, a scalable, self-supervised similarity assessment framework, equipped with a hybrid message-passing and graph transformer encoder. To address transformer scalability, we incorporate a linear transformer backbone and introduce a partitioned training pipeline for efficient memory and parallelism management. Evaluations on synthetic and real-world CircuitNet datasets show that Pieceformer reduces mean absolute error (MAE) by 24.9% over the baseline and is the only method to correctly cluster all real-world design groups. We further demonstrate the practical usage of our model through a case study on a partitioning task, achieving up to 89% runtime reduction. These results validate the framework's effectiveness for scalable, unbiased design reuse in modern VLSI systems.
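
The abstract names a hybrid message-passing + linear graph transformer encoder but gives no implementation detail. Below is a minimal sketch of what such a hybrid encoder could look like, assuming a PyTorch-style model where a simple mean-aggregation message-passing layer captures local netlist structure and a kernelized linear-attention layer provides O(N) global mixing. All class names, dimensions, and the ELU+1 feature map are illustrative assumptions, not the paper's actual architecture.

```python
# Sketch of a hybrid message-passing + linear-attention graph encoder.
# Names and dimensions are illustrative; not taken from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LinearAttention(nn.Module):
    """Linear-complexity attention via the kernel trick: phi(Q) (phi(K)^T V)."""
    def __init__(self, dim: int):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, dim)
        q = F.elu(self.q(x)) + 1.0          # positive feature map
        k = F.elu(self.k(x)) + 1.0
        v = self.v(x)
        kv = k.transpose(0, 1) @ v          # (dim, dim): cost linear in num_nodes
        z = q @ k.sum(dim=0, keepdim=True).transpose(0, 1)  # (num_nodes, 1) normalizer
        return (q @ kv) / (z + 1e-6)


class MessagePassing(nn.Module):
    """Simple GNN layer: mean aggregation over incoming neighbors + residual."""
    def __init__(self, dim: int):
        super().__init__()
        self.lin = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        # edge_index: (2, num_edges) with rows (source, destination)
        src, dst = edge_index
        agg = torch.zeros_like(x).index_add_(0, dst, x[src])
        deg = torch.zeros(x.size(0), 1).index_add_(
            0, dst, torch.ones(dst.size(0), 1)).clamp(min=1.0)
        return torch.relu(self.lin(agg / deg) + x)


class HybridGraphEncoder(nn.Module):
    """Message passing for local structure, linear attention for global context."""
    def __init__(self, in_dim: int, dim: int = 64):
        super().__init__()
        self.embed = nn.Linear(in_dim, dim)
        self.mp = MessagePassing(dim)
        self.attn = LinearAttention(dim)

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        h = self.embed(x)
        h = self.mp(h, edge_index)
        h = h + self.attn(h)                # residual global mixing
        return h.mean(dim=0)                # graph-level embedding for similarity


if __name__ == "__main__":
    x = torch.randn(5, 8)                                # 5 nodes, 8 features each
    edges = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 4]])   # toy circuit graph
    emb = HybridGraphEncoder(8)(x, edges)
    print(emb.shape)                                     # torch.Size([64])
```

Graph-level embeddings produced this way could be compared (e.g., with cosine similarity) to score design similarity for reuse; the partitioned training pipeline mentioned in the abstract is not reflected in this toy sketch.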
@article{yang2025_2506.15907,
  title   = {Pieceformer: Similarity-Driven Knowledge Transfer via Scalable Graph Transformer in VLSI},
  author  = {Hang Yang and Yusheng Hu and Yong Liu and Cong},
  journal = {arXiv preprint arXiv:2506.15907},
  year    = {2025}
}