
String and Membrane Gaussian Processes

Abstract

In this paper we introduce a novel framework for performing exact nonparametric Bayesian inference on latent functions that is particularly suitable for \emph{Big Data} tasks. Firstly, we introduce a class of stochastic processes we refer to as \emph{string Gaussian processes} (\emph{string GPs}), which are not to be mistaken for Gaussian processes operating on text. We construct \emph{string GPs} so that their finite-dimensional marginals exhibit suitable \emph{local} conditional independence structures, which allow for \emph{scalable}, \emph{distributed}, and \emph{flexible} nonparametric Bayesian inference without resorting to approximations. Moreover, \emph{string GPs} provide a flexible framework for building nonstationary functional priors from popular kernels, allowing for arbitrarily complex local patterns in the data while ensuring some mild global regularity constraints. Furthermore, \emph{string GP} priors naturally cope with heterogeneous input data, and the gradient of the learned latent function is readily available for explanatory analysis. Secondly, we provide some theoretical results relating our approach to the \emph{standard GP paradigm}. In particular, we prove that some \emph{string GPs} are Gaussian processes, which provides a complementary \emph{global} perspective on our framework. Finally, we derive a scalable and distributed MCMC scheme for supervised learning tasks under \emph{string GP} priors. The proposed MCMC scheme has computational time complexity $\mathcal{O}(N)$ and memory requirement $\mathcal{O}(dN)$, where $N$ is the data size and $d$ the dimension of the input space. We illustrate the efficacy of the proposed approach on several synthetic and real-world datasets, including a dataset with 6 million input points and 8 attributes.
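To give a rough intuition for the \emph{local} conditional independence structure mentioned above, the sketch below samples a path from a simplified string-GP-like prior in one dimension: the input interval is partitioned into segments ("strings"), each segment carries its own GP, and consecutive segments communicate only through the shared boundary value. This is an illustrative simplification, not the paper's construction; in particular, the conditioning on boundary derivatives used by actual string GPs is omitted, and the function and kernel names (`rbf`, `sample_string_path`) are hypothetical.

```python
import numpy as np

def rbf(x, y, ell=0.3, sigma=1.0):
    # Squared-exponential kernel; each string could carry its own hyperparameters,
    # which is one way to obtain nonstationary behaviour from a stationary kernel.
    return sigma**2 * np.exp(-0.5 * (x[:, None] - y[None, :])**2 / ell**2)

def sample_string_path(boundaries, n_per_string=50, rng=None):
    """Sample a path from a simplified string-GP-like prior.

    Consecutive strings are tied only through the shared boundary value,
    giving the Markovian, locally conditionally independent structure
    described in the abstract. Derivative conditioning is omitted here.
    """
    rng = np.random.default_rng() if rng is None else rng
    xs, fs = [], []
    left_val = rng.normal(0.0, 1.0)           # value at the first boundary
    for a, b in zip(boundaries[:-1], boundaries[1:]):
        x = np.linspace(a, b, n_per_string)
        # Condition this segment's GP on its left-boundary value only.
        K_bb = rbf(np.array([a]), np.array([a])) + 1e-9
        K_xb = rbf(x, np.array([a]))
        K_xx = rbf(x, x)
        mean = (K_xb / K_bb).ravel() * left_val
        cov = K_xx - K_xb @ K_xb.T / K_bb
        f = rng.multivariate_normal(mean, cov + 1e-9 * np.eye(len(x)))
        f[0] = left_val                       # enforce continuity at the boundary
        xs.append(x)
        fs.append(f)
        left_val = f[-1]                      # pass the right-boundary value on
    return np.concatenate(xs), np.concatenate(fs)

# Example: 4 strings on [0, 1]
x, f = sample_string_path(np.linspace(0.0, 1.0, 5))
```

Because each segment only ever conditions on a boundary value rather than on the full path, the cost of sampling (and, analogously, of inference) grows linearly in the number of segments, which is the kind of locality the abstract's $\mathcal{O}(N)$ claim exploits.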
