SDP-CROWN: Efficient Bound Propagation for Neural Network Verification with Tightness of Semidefinite Programming

Neural network verifiers based on linear bound propagation scale impressively to massive models but can be surprisingly loose when neuron coupling is crucial. Conversely, semidefinite programming (SDP) verifiers capture inter-neuron coupling naturally, but their cubic complexity restricts them to only small models. In this paper, we propose SDP-CROWN, a novel hybrid verification framework that combines the tightness of SDP relaxations with the scalability of bound-propagation verifiers. At the core of SDP-CROWN is a new linear bound, derived via SDP principles, that explicitly captures ℓ2-norm-based inter-neuron coupling while adding only one extra parameter per layer. This bound can be integrated seamlessly into any linear bound-propagation pipeline, preserving the inherent scalability of such methods yet significantly improving tightness. In theory, we prove that our inter-neuron bound can be up to a factor of √n tighter than traditional per-neuron bounds. In practice, when incorporated into the state-of-the-art α,β-CROWN verifier, we observe markedly improved verification performance on large models with up to 65 thousand neurons and 2.47 million parameters, achieving tightness that approaches that of costly SDP-based methods.
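The √n tightness gap can be illustrated with a simple numerical sketch (not the paper's actual bound): for an input confined to an ℓ2 ball of radius ρ, per-neuron interval reasoning bounds a neuron's pre-activation w·x via |x_i − x0_i| ≤ ρ coordinate-wise, giving ρ‖w‖1, while reasoning over the whole ball at once gives the Cauchy–Schwarz bound ρ‖w‖2, which can be up to √n times smaller. All variable names below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000                      # input dimension
w = rng.standard_normal(n)    # one neuron's weight row
x0 = np.zeros(n)              # nominal input
rho = 0.1                     # l2 perturbation radius

# Per-neuron (interval) reasoning: ||x - x0||_2 <= rho implies
# |x_i - x0_i| <= rho per coordinate, so max w.x <= w.x0 + rho*||w||_1.
per_neuron_ub = w @ x0 + rho * np.linalg.norm(w, 1)

# Coupled reasoning over the whole l2 ball (Cauchy-Schwarz):
# max w.x = w.x0 + rho*||w||_2, which is exact for this linear function.
coupled_ub = w @ x0 + rho * np.linalg.norm(w, 2)

# The gap is at most sqrt(n), since ||w||_1 <= sqrt(n)*||w||_2.
print(per_neuron_ub / coupled_ub)
```

For Gaussian weights the observed ratio grows on the order of √n, matching the worst-case factor stated in the abstract; SDP-CROWN's contribution is recovering this kind of coupled tightness inside a bound-propagation pipeline rather than for a single linear function.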
@article{chiu2025_2506.06665,
  title={SDP-CROWN: Efficient Bound Propagation for Neural Network Verification with Tightness of Semidefinite Programming},
  author={Hong-Ming Chiu and Hao Chen and Huan Zhang and Richard Y. Zhang},
  journal={arXiv preprint arXiv:2506.06665},
  year={2025}
}