Gaussian processes (GPs) are crucial in machine learning for quantifying uncertainty in predictions. However, their associated covariance matrices, defined by kernel functions, are typically dense and large-scale, posing significant computational challenges. This paper introduces a matrix-free method that uses the Non-equispaced Fast Fourier Transform (NFFT) to multiply kernel matrices and their derivatives with vectors in nearly linear complexity for a prescribed accuracy level. To address high-dimensional problems, we propose an additive kernel approach: each sub-kernel captures lower-order feature interactions, which allows the NFFT method to be applied efficiently and can increase accuracy on various real-world datasets. Additionally, we implement a preconditioning strategy that accelerates hyperparameter tuning, further improving the efficiency and effectiveness of GPs.
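To make the additive-kernel idea concrete, here is a minimal NumPy sketch (not the authors' code, and without the NFFT acceleration itself): the kernel is a sum of one-dimensional RBF sub-kernels, one per feature, so a kernel-matrix-vector product decomposes into independent low-dimensional matvecs. In the paper's setting, each of these per-feature matvecs is exactly the operation the NFFT evaluates matrix-free in nearly linear time; here the small matrices are formed densely for illustration only. The function names and lengthscale parameters are illustrative assumptions.

```python
import numpy as np

def rbf_1d(x, ell):
    """Dense 1-D RBF kernel matrix K_ij = exp(-(x_i - x_j)^2 / (2 ell^2))."""
    d = x[:, None] - x[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def additive_kernel_matvec(X, v, ells):
    """Multiply v by K = sum_d K_d, where sub-kernel K_d acts on feature d only.
    In the NFFT-accelerated setting, each K_d @ v is computed matrix-free
    (no K_d is ever formed); the dense product below stands in for that step."""
    n, D = X.shape
    out = np.zeros(n)
    for d in range(D):
        out += rbf_1d(X[:, d], ells[d]) @ v
    return out

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))   # 50 points, 3 features
v = rng.standard_normal(50)
u = additive_kernel_matvec(X, v, ells=[1.0, 0.5, 2.0])
```

Because each sub-kernel depends on only one (or a few) features, the Fourier-based evaluation avoids the curse of dimensionality that would otherwise prevent applying the NFFT to the full d-dimensional kernel directly.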
@article{wagner2025_2504.00480,
  title   = {Preconditioned Additive Gaussian Processes with Fourier Acceleration},
  author  = {Theresa Wagner and Tianshi Xu and Franziska Nestler and Yuanzhe Xi and Martin Stoll},
  journal = {arXiv preprint arXiv:2504.00480},
  year    = {2025}
}