Scalable Gaussian Processes, with Guarantees: Kernel Approximations and Deep Feature Extraction

AAAI Conference on Artificial Intelligence (AAAI), 2020
Abstract

We provide approximation guarantees for a linear-time inferential framework for Gaussian processes, using two low-rank kernel approximations based on random Fourier features and truncation of Mercer expansions. In particular, we bound the Kullback-Leibler divergence between the idealized Gaussian process and the one resulting from a low-rank approximation to its kernel. Additionally, we present strong evidence that these two approximations, enhanced by an initial automatic feature extraction through deep neural networks, outperform a broad range of state-of-the-art methods in terms of time efficiency, negative log-predictive density, and root mean squared error.
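The linear-time framework rests on replacing the exact kernel with an explicit low-rank feature map. As an illustrative sketch only, not the authors' implementation, the following NumPy example shows the random Fourier feature approximation for an RBF kernel; the function name `rff_features` and all parameter values here are assumptions made for this example.

```python
import numpy as np

def rff_features(X, num_features, lengthscale, rng):
    """Random Fourier features phi(X) such that phi(X) @ phi(Y).T
    approximates the RBF kernel exp(-||x - y||^2 / (2 * lengthscale^2))."""
    d = X.shape[1]
    # Frequencies drawn from the kernel's spectral density, N(0, lengthscale^-2 I).
    omega = rng.normal(scale=1.0 / lengthscale, size=(d, num_features))
    # Random phases, uniform on [0, 2*pi).
    phase = rng.uniform(0.0, 2.0 * np.pi, size=num_features)
    return np.sqrt(2.0 / num_features) * np.cos(X @ omega + phase)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
Phi = rff_features(X, num_features=4000, lengthscale=1.0, rng=rng)

# Compare the rank-m approximation against the exact RBF Gram matrix.
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K_exact = np.exp(-sq_dists / 2.0)
print(np.abs(Phi @ Phi.T - K_exact).max())  # shrinks as num_features grows
```

Because the approximate Gram matrix factors as Phi @ Phi.T with Phi of shape (n, m), GP regression under such a kernel reduces to Bayesian linear regression on the m features, costing O(n m^2) rather than the O(n^3) of exact inference; this is what makes inference linear in the number of data points.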
