A General Theory for Kernel Packets: from state space model to compactly supported basis

It is well known that the state space (SS) model formulation of a Gaussian process (GP) can lower its training and prediction time both to $O(n)$ for $n$ data points. We prove that an $m$-dimensional SS model formulation of a GP is equivalent to a concept we introduce as the general right Kernel Packet (KP): a transformation of the GP covariance function $K$ such that $\sum_{i=0}^{m} a_i D_t^{(j)} K(t, t_i) = 0$ holds for any $t \leq t_0$, $0 \leq j \leq m-1$, and $m+1$ consecutive points $t_i$, where $D_t^{(j)}$ denotes the $j$-th order derivative acting on $t$. We extend this idea to the backward SS model formulation, leading to the left KP for the next $m$ consecutive points: $\sum_{i=0}^{m} b_i D_t^{(j)} K(t, t_{m+i}) = 0$ for any $t \geq t_{2m}$. By combining both left and right KPs, we prove that a suitable linear combination of these $2m+1$ covariance functions yields a KP function compactly supported on $(t_0, t_{2m})$. KPs improve GP prediction time to $O(\log n)$ or $O(1)$, enable broader applications including GP derivatives and kernel multiplications, and can be generalized to multi-dimensional additive and product kernels for scattered data.
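As a minimal numerical sketch (not the authors' code), consider the Mat&eacute;rn-1/2 (exponential) kernel $K(s,t) = e^{-|s-t|}$, whose SS model has state dimension $m = 1$, so $2m+1 = 3$ knots suffice. The right and left KP conditions (with $j = 0$ only, since $m-1 = 0$) reduce to two linear constraints on three coefficients; the one-dimensional null space gives a function that vanishes outside the knot interval. The knot locations below are an arbitrary illustrative choice.

```python
import numpy as np

# Exponential (Matern-1/2) kernel, SS model state dimension m = 1.
def kernel(s, t):
    return np.exp(-np.abs(s - t))

knots = np.array([0.0, 1.0, 2.5])  # t_0 < t_1 < t_2 (hypothetical choice)

# For t <= t_0: K(t, t_i) = e^{t} e^{-t_i}; for t >= t_2: K(t, t_i) = e^{-t} e^{t_i}.
# Hence phi(t) = sum_i c_i K(t, t_i) vanishes on both tails iff
#   sum_i c_i e^{-t_i} = 0   (right KP condition)  and
#   sum_i c_i e^{+t_i} = 0   (left KP condition).
A = np.vstack([np.exp(-knots), np.exp(knots)])  # 2 x 3 constraint matrix

# The one-dimensional null space of A holds the KP coefficients c_i.
_, _, vt = np.linalg.svd(A)
c = vt[-1]

def phi(t):
    """Kernel Packet: linear combination of covariance functions."""
    return sum(ci * kernel(t, ti) for ci, ti in zip(c, knots))

# phi is (numerically) zero outside (t_0, t_2) and nonzero inside.
print(phi(-1.0), phi(3.5), phi(1.2))
```

Because each basis function has support spanning only a few neighboring knots, the resulting covariance matrix over sorted inputs is banded, which is what enables the fast prediction claimed in the abstract.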