Leverage Score Sampling for Tensor Product Matrices in Input Sparsity Time

Abstract

We propose an input sparsity time sampling algorithm that can spectrally approximate the Gram matrix corresponding to the q-fold column-wise tensor product of q matrices using a nearly optimal number of samples, improving upon all previously known methods by poly(q) factors. Furthermore, for the important special case of the q-fold self-tensoring of a dataset, which is the feature matrix of the degree-q polynomial kernel, the leading term of our method's runtime is proportional to the size of the input dataset and has no dependence on q. Previous techniques either incur a poly(q) slowdown in their runtime or remove the dependence on q at the expense of a sub-optimal target dimension, and in addition depend quadratically on the number of data points in their runtime. Our sampling technique relies on a collection of q partially correlated random projections which can be simultaneously applied to a dataset X in total time that depends only on the size of X, while at the same time their q-fold Kronecker product acts as a near-isometry for any fixed vector in the column span of X^{\otimes q}. We also show that our sampling methods generalize to other classes of kernels beyond polynomial, such as Gaussian and Neural Tangent kernels.
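To make the objective concrete, here is a minimal numpy sketch of leverage score sampling applied to the q-fold self-tensor product X^{\otimes q}. This is not the paper's algorithm: it materializes X^{\otimes q} explicitly (time exponential in q) rather than using correlated random projections in input sparsity time, and the helper names self_tensor and leverage_score_sample are hypothetical. It only illustrates the guarantee being targeted, namely that a small rescaled row sample S A satisfies (S A)^T (S A) ≈ A^T A in spectral norm.

```python
import numpy as np

def self_tensor(X, q):
    """Column-wise q-fold Kronecker self-product.

    For X of shape (d, n) with data points as columns, returns the
    d^q x n matrix whose j-th column is the q-fold Kronecker product
    of the j-th column of X with itself.
    """
    T = X
    for _ in range(q - 1):
        # Column-wise Kronecker step: kron each column of T with the
        # corresponding column of X, then flatten back to a matrix.
        T = np.einsum('in,jn->ijn', T, X).reshape(-1, X.shape[1])
    return T

def leverage_score_sample(A, m, rng):
    """Sample m rows of A proportionally to their leverage scores.

    Returns a rescaled row sample SA such that (SA)^T (SA) spectrally
    approximates A^T A with high probability for m large enough.
    """
    # Leverage scores from a thin QR factorization: tau_i = ||Q_i||^2.
    Q, _ = np.linalg.qr(A)
    tau = np.sum(Q**2, axis=1)
    p = tau / tau.sum()
    idx = rng.choice(A.shape[0], size=m, replace=True, p=p)
    # Rescale sampled rows so the Gram-matrix estimator is unbiased.
    return A[idx] / np.sqrt(m * p[idx])[:, None]

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 8))   # 5 features, 8 points; small so X^{⊗q} fits
q = 3
A = self_tensor(X, q)             # shape (5**3, 8); Gram matrix A^T A is the
                                  # degree-q polynomial kernel matrix of X
SA = leverage_score_sample(A, m=200, rng=rng)
err = np.linalg.norm(SA.T @ SA - A.T @ A, 2) / np.linalg.norm(A.T @ A, 2)
print(f"relative spectral error: {err:.3f}")
```

The point of the paper's contribution is to obtain such a sample without ever forming the d^q-dimensional matrix A, which is what the collection of q partially correlated random projections achieves.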
