Towards Non-Euclidean Foundation Models: Advancing AI Beyond Euclidean Frameworks

In the era of foundation models and Large Language Models (LLMs), Euclidean space is the de facto geometric setting of our machine learning architectures. However, recent literature has demonstrated that this choice comes with fundamental limitations. To that end, non-Euclidean learning is quickly gaining traction, particularly in web-related applications where complex relationships and structures are prevalent. Non-Euclidean spaces, such as hyperbolic, spherical, and mixed-curvature spaces, have been shown to provide more efficient and effective representations for data with intrinsic geometric properties, including web-related data like social network topology, query-document relationships, and user-item interactions. Integrating foundation models with non-Euclidean geometries has great potential to enhance their ability to capture and model these underlying structures, leading to better performance in search, recommendation, and content understanding. This workshop focuses on the intersection of Non-Euclidean Foundation Models and Geometric Learning (NEGEL), exploring its potential benefits for advancing web-related technologies, along with the associated challenges and future directions. Workshop page: [this https URL](this https URL)
@article{yang2025_2505.14417,
  title={Towards Non-Euclidean Foundation Models: Advancing AI Beyond Euclidean Frameworks},
  author={Menglin Yang and Yifei Zhang and Jialin Chen and Melanie Weber and Rex Ying},
  journal={arXiv preprint arXiv:2505.14417},
  year={2025}
}