BSA: Ball Sparse Attention for Large-scale Geometries

14 June 2025
Catalin E. Brita, Hieu Nguyen, Lohithsai Yadala Chanchu, Domonkos Nagy, Maksim Zhdanov
Main: 4 pages · Appendix: 1 page · Bibliography: 2 pages · 4 figures · 6 tables
Abstract

Self-attention scales quadratically with input size, limiting its use for large-scale physical systems. Although sparse attention mechanisms provide a viable alternative, they are primarily designed for regular structures such as text or images, making them inapplicable to irregular geometries. In this work, we present Ball Sparse Attention (BSA), which adapts Native Sparse Attention (NSA) (Yuan et al., 2025) to unordered point sets by imposing regularity using the Ball Tree structure from the Erwin Transformer (Zhdanov et al., 2025). We modify NSA's components to work with ball-based neighborhoods, yielding a global receptive field at sub-quadratic cost. On an airflow pressure prediction task, we achieve accuracy comparable to Full Attention while significantly reducing the theoretical computational complexity. Our implementation is available at this https URL.
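The sketch below is a minimal illustration (not the authors' implementation) of the idea described in the abstract: points are ordered by a ball tree so that contiguous chunks ("balls") are spatially local, attention is restricted to each ball, and a compressed branch that pools keys/values per ball gives every point a coarse global receptive field at sub-quadratic cost. All names (`ball_size`, `ball_sparse_attention`, the mean-pooling compression, and the omission of NSA's selection and gating branches) are simplifying assumptions for illustration.

```python
# Hedged sketch of ball-based sparse attention: local attention within each
# ball plus attention to pooled per-ball summaries. Assumes points are already
# ordered by a ball tree and N is a multiple of ball_size.
import torch
import torch.nn.functional as F


def ball_local_attention(q, k, v, ball_size):
    """Attend only within contiguous balls of `ball_size` points."""
    B, N, D = q.shape
    nb = N // ball_size
    # Reshape so each ball becomes its own attention problem: (B*nb, ball_size, D).
    qb = q.reshape(B * nb, ball_size, D)
    kb = k.reshape(B * nb, ball_size, D)
    vb = v.reshape(B * nb, ball_size, D)
    out = F.scaled_dot_product_attention(qb, kb, vb)  # cost ~ O(N * ball_size)
    return out.reshape(B, N, D)


def compressed_global_attention(q, k, v, ball_size):
    """Every query attends to one mean-pooled key/value per ball (coarse global view)."""
    B, N, D = q.shape
    nb = N // ball_size
    kc = k.reshape(B, nb, ball_size, D).mean(dim=2)  # (B, nb, D)
    vc = v.reshape(B, nb, ball_size, D).mean(dim=2)
    return F.scaled_dot_product_attention(q, kc, vc)  # cost ~ O(N * nb)


def ball_sparse_attention(q, k, v, ball_size=64):
    """Combine local and compressed branches (NSA's selection/gating omitted here)."""
    return ball_local_attention(q, k, v, ball_size) + \
        compressed_global_attention(q, k, v, ball_size)


if __name__ == "__main__":
    # Toy usage: 256 points assumed pre-ordered by a ball tree, feature dim 32.
    q, k, v = (torch.randn(1, 256, 32) for _ in range(3))
    print(ball_sparse_attention(q, k, v, ball_size=64).shape)  # torch.Size([1, 256, 32])
```

Because each branch touches only `N * ball_size` or `N * (N / ball_size)` pairs, the combined cost stays well below the `N^2` of full attention while every point still receives some signal from the whole geometry.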

@article{brita2025_2506.12541,
  title={BSA: Ball Sparse Attention for Large-scale Geometries},
  author={Catalin E. Brita and Hieu Nguyen and Lohithsai Yadala Chanchu and Domonkos Nagy and Maksim Zhdanov},
  journal={arXiv preprint arXiv:2506.12541},
  year={2025}
}