
String Gaussian Processes

Abstract

In this paper we introduce a novel framework for performing exact nonparametric Bayesian inference on latent functions that is particularly suitable for Big Data tasks. Firstly, we introduce a class of stochastic processes we refer to as string Gaussian processes (string GPs). The local nature of the string GP construction yields a principled approach to exact, scalable, distributed, and flexible nonparametric Bayesian inference. Moreover, string GPs provide a flexible framework for building nonstationary functional priors from popular kernels, allowing for arbitrarily complex local patterns in the data while ensuring mild global regularity constraints. Furthermore, string GP priors naturally cope with heterogeneous input data, and the gradient of the learned latent function is readily available for explanatory analysis. Secondly, we provide theoretical results relating our approach to the standard GP paradigm. In particular, we prove that some string GPs are Gaussian processes, which provides a complementary global perspective on our framework. Finally, we derive a scalable and distributed MCMC scheme for supervised learning tasks under string GP priors. The proposed MCMC scheme has computational time complexity O(n) and memory requirement O(dn), where n is the data size and d the dimension of the input space.
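To put the O(n) claim in context, the following sketch (not from the paper; all function names are illustrative) shows why exact inference under a standard GP prior is the bottleneck the authors target: the posterior mean requires factorising the n x n kernel matrix, which costs O(n^3) time and O(n^2) memory.

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between two 1-D input sets."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior_mean(x_train, y_train, x_test, noise=1e-2):
    """Exact GP regression posterior mean (standard paradigm, not string GPs).

    The Cholesky factorisation of the n x n kernel matrix costs O(n^3) time
    and O(n^2) memory -- the scaling that motivates schemes like the paper's
    O(n)-time, O(dn)-memory string GP MCMC.
    """
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    L = np.linalg.cholesky(K)  # O(n^3) bottleneck
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    return rbf_kernel(x_test, x_train) @ alpha

# Toy data: noisy sine observed at 50 points.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0 * np.pi, 50)
y = np.sin(x) + 0.1 * rng.standard_normal(50)
mean = gp_posterior_mean(x, y, x)
```

On this toy problem the posterior mean closely recovers sin(x); doubling n roughly octuples the factorisation cost, which is exactly what a linear-time scheme avoids.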
