Scalable Gaussian Processes, with Guarantees: Kernel Approximations and Deep Feature Extraction

AAAI Conference on Artificial Intelligence (AAAI), 2020
Abstract

We provide a linear-time inferential framework for Gaussian processes that supports automatic feature extraction through deep neural networks and low-rank kernel approximations. Importantly, we derive approximation guarantees bounding the Kullback-Leibler divergence between the idealized Gaussian process and one resulting from a low-rank approximation to its kernel. We consider two types of approximations, which yield two instantiations of our framework: Deep Fourier Gaussian Processes, based on random Fourier feature low-rank approximations, and Deep Mercer Gaussian Processes, based on truncating the Mercer expansion of the kernel. An extensive experimental evaluation of these two instantiations on a broad collection of real-world datasets provides strong evidence that they outperform a range of state-of-the-art methods in terms of time efficiency, negative log-predictive density, and root mean squared error.
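To illustrate the low-rank approximation underlying the Deep Fourier instantiation, the following is a minimal sketch of random Fourier features for the RBF kernel (Rahimi and Recht's construction), not the paper's full framework: the exact kernel matrix K is replaced by Z Z^T for an explicit feature map Z, which is what enables linear-time inference. The function name and parameters are illustrative, not from the paper.

```python
import numpy as np

def random_fourier_features(X, num_features=100, lengthscale=1.0, seed=0):
    """Map X (n x d) to Z (n x D) such that Z @ Z.T approximates the
    RBF kernel k(x, x') = exp(-||x - x'||^2 / (2 * lengthscale^2)).

    Uses z(x) = sqrt(2/D) * cos(x @ W + b), with the rows of W drawn from
    the kernel's spectral density N(0, I / lengthscale^2) and b ~ U[0, 2*pi].
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.normal(0.0, 1.0 / lengthscale, size=(d, num_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=num_features)
    return np.sqrt(2.0 / num_features) * np.cos(X @ W + b)
```

With Z in hand, GP regression reduces to Bayesian linear regression in the D-dimensional feature space, so training costs O(n D^2) rather than the O(n^3) of exact GP inference; the paper's contribution includes bounding the KL divergence incurred by this kind of approximation.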
