
Flash Invariant Point Attention

Abstract

Invariant Point Attention (IPA) is a key algorithm for geometry-aware modeling in structural biology, central to many protein and RNA models. However, its quadratic complexity in sequence length limits the input sizes it can handle. We introduce FlashIPA, a factorized reformulation of IPA that leverages hardware-efficient FlashAttention to achieve linear scaling in GPU memory and wall-clock time with sequence length. FlashIPA matches or exceeds standard IPA performance while substantially reducing computational costs. FlashIPA extends training to previously unattainable lengths, and we demonstrate this by re-training generative models without length restrictions and generating structures of thousands of residues. FlashIPA is available at this https URL.
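The abstract does not spell out the factorization, but the core idea of folding a pairwise attention bias into a fused attention kernel can be illustrated with a minimal sketch. The snippet below is an assumption-laden toy, not the authors' implementation: it supposes the IPA pair bias b_ij can be written as a low-rank product of per-residue factors (here called u and p, both hypothetical names), which are concatenated onto the queries and keys so that PyTorch's scaled_dot_product_attention (which can dispatch to a FlashAttention backend) computes the biased attention without ever materializing the L x L bias matrix.

```python
import torch
import torch.nn.functional as F


def ipa_bias_attention_reference(q, k, v, u, p):
    """Reference attention with an explicit pairwise bias.

    q, k, v: (..., L, d) queries/keys/values.
    u, p:    (..., L, r) low-rank factors of the pair bias b_ij = <u_i, p_j>.
    Materializes an (..., L, L) score matrix, i.e. O(L^2) memory.
    """
    bias = torch.einsum("...ir,...jr->...ij", u, p)
    scores = torch.einsum("...id,...jd->...ij", q, k) + bias
    return torch.einsum("...ij,...jd->...id", scores.softmax(dim=-1), v)


def ipa_bias_attention_fused(q, k, v, u, p):
    """Same attention, but the bias factors ride along with q and k so a fused
    kernel (FlashAttention when the backend supports the dtype/head size)
    computes it without building the L x L matrix."""
    q_aug = torch.cat([q, u], dim=-1)
    k_aug = torch.cat([k, p], dim=-1)
    # scale=1.0 matches the unscaled reference above (needs PyTorch >= 2.1).
    return F.scaled_dot_product_attention(q_aug, k_aug, v, scale=1.0)


if __name__ == "__main__":
    B, H, L, d, r = 2, 4, 128, 32, 8
    q, k, v = (torch.randn(B, H, L, d) for _ in range(3))
    u, p = torch.randn(B, H, L, r), torch.randn(B, H, L, r)
    out_ref = ipa_bias_attention_reference(q, k, v, u, p)
    out_fused = ipa_bias_attention_fused(q, k, v, u, p)
    print(torch.allclose(out_ref, out_fused, atol=1e-4))  # expected: True
```

How FlashIPA actually factorizes the IPA bias (which also involves point-distance terms) is detailed in the paper and the linked repository; the sketch only shows why a factorized bias removes the quadratic memory term.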

@article{liu2025_2505.11580,
  title={Flash Invariant Point Attention},
  author={Andrew Liu and Axel Elaldi and Nicholas T Franklin and Nathan Russell and Gurinder S Atwal and Yih-En A Ban and Olivia Viessmann},
  journal={arXiv preprint arXiv:2505.11580},
  year={2025}
}