
A General Theory for Kernel Packets: from state space model to compactly supported basis

Liang Ding
Rui Tuo
Abstract

It is well known that the state space (SS) model formulation of a Gaussian process (GP) can lower both its training and prediction time to $\mathcal{O}(n)$ for $n$ data points. We prove that an $m$-dimensional SS model formulation of a GP is equivalent to a concept we introduce as the general right Kernel Packet (KP): a transformation of the GP covariance $K$ such that $\sum_{i=0}^{m}a_i D_t^{(j)}K(t,t_i)=0$ holds for any $t \leq t_1$, $0 \leq j \leq m-1$, and $m+1$ consecutive points $t_i$, where $D_t^{(j)}f(t)$ denotes the $j$-th derivative acting on $t$. We extend this idea to the backward SS model formulation, leading to the left KP for the next $m$ consecutive points: $\sum_{i=0}^{m}b_i D_t^{(j)}K(t,t_{m+i})=0$ for any $t \geq t_{2m}$. By combining both left and right KPs, we prove that a suitable linear combination of these covariance functions yields $m$ KP functions compactly supported on $(t_0,t_{2m})$. KPs improve GP prediction time to $\mathcal{O}(\log n)$ or $\mathcal{O}(1)$, enable broader applications including GP derivatives and kernel multiplications, and can be generalized to multi-dimensional additive and product kernels for scattered data.
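As an illustrative sketch (not taken from the paper itself), the compact-support construction can be checked numerically in the simplest case $m = 1$, using the Matérn-1/2 (exponential) kernel $K(t,s) = e^{-|t-s|}$, whose SS model is one-dimensional. With $2m+1 = 3$ hypothetical knots $t_0 < t_1 < t_2$, coefficients $a_i$ satisfying $\sum_i a_i e^{-t_i} = 0$ and $\sum_i a_i e^{t_i} = 0$ make $\phi(t) = \sum_i a_i K(t, t_i)$ vanish for $t \leq t_0$ and $t \geq t_2$, i.e. $\phi$ is compactly supported on $(t_0, t_2)$:

```python
import numpy as np

# Matern-1/2 (exponential) kernel, whose state space model has dimension m = 1.
def exp_kernel(t, s):
    return np.exp(-np.abs(t - s))

# Hypothetical knots t_0 < t_1 < t_2 (2m + 1 = 3 points for m = 1).
knots = np.array([0.0, 1.0, 2.0])

# For t <= t_0 each K(t, t_i) is proportional to e^t, with weight e^{-t_i};
# for t >= t_2 it is proportional to e^{-t}, with weight e^{t_i}.
# Cancelling both exponential tails gives a 2x3 homogeneous system;
# its null-space vector supplies the KP coefficients (up to scale).
A = np.vstack([np.exp(-knots), np.exp(knots)])
_, _, vt = np.linalg.svd(A)
a = vt[-1]

def phi(t):
    """Kernel packet: a linear combination of covariance functions."""
    return sum(ai * exp_kernel(t, ti) for ai, ti in zip(a, knots))

# phi vanishes outside (t_0, t_2) but is nonzero inside:
# abs(phi(-1.0)) and abs(phi(3.0)) are ~0, abs(phi(1.0)) is not.
```

The same tail-cancellation argument is what the left/right KP conditions encode in general: each side of the SS model confines the translates $K(\cdot, t_i)$ to an $m$-dimensional solution space, so enough consecutive translates become linearly dependent there.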
