
Leverage Score Sampling for Tensor Product Matrices in Input Sparsity Time

International Conference on Machine Learning (ICML), 2022
Abstract

We give an input sparsity time sampling algorithm for spectrally approximating the Gram matrix corresponding to the q-fold column-wise tensor product of q matrices using a nearly optimal number of samples, improving upon all previously known methods by poly(q) factors. Furthermore, for the important special case of the q-fold self-tensoring of a dataset, which is the feature matrix of the degree-q polynomial kernel, the leading term of our method's runtime is proportional to the size of the dataset and has no dependence on q. Previous techniques either incur a poly(q) factor slowdown in their runtime, or remove the dependence on q at the expense of a sub-optimal target dimension and a quadratic runtime dependence on the number of data points. Our sampling technique relies on a collection of q partially correlated random projections which can be simultaneously applied to a dataset X in total time that depends only on the size of X, while their q-fold Kronecker product acts as a near-isometry for any fixed vector in the column span of X^{\otimes q}. We show that our sampling methods generalize to other classes of kernels beyond polynomial, such as Gaussian and Neural Tangent kernels.
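To make the objects in the abstract concrete, the sketch below builds the q-fold column-wise tensor (Khatri-Rao) power X^{\otimes q} densely and spectrally approximates its Gram matrix by sampling rows proportionally to exact leverage scores. This is only an illustrative reference construction: the dense d^q-row feature matrix and exact QR-based leverage scores are precisely the exponential costs the paper's input-sparsity algorithm avoids, and all names here are our own, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

def khatri_rao_power(X, q):
    # Column-wise q-fold Kronecker power of X: X has shape (d, n),
    # result has shape (d**q, n); column j equals kron(x_j, ..., x_j),
    # i.e. the degree-q polynomial kernel feature map of x_j.
    # Dense reference construction, NOT the paper's fast method.
    Z = X
    for _ in range(q - 1):
        Z = np.einsum('in,jn->ijn', Z, X).reshape(-1, X.shape[1])
    return Z

def leverage_score_sample(A, m, rng):
    # Sample m rows of A with probabilities proportional to their
    # leverage scores, rescaled so that E[S^T S] = A^T A.
    Q, _ = np.linalg.qr(A)              # thin QR of the tall matrix A
    ell = np.sum(Q**2, axis=1)          # row leverage scores
    p = ell / ell.sum()
    idx = rng.choice(A.shape[0], size=m, replace=True, p=p)
    return A[idx] / np.sqrt(m * p[idx, None])

d, n, q, m = 3, 8, 2, 200
X = rng.standard_normal((d, n))
A = khatri_rao_power(X, q)              # 9 x 8 feature matrix X^{(x)2}
S = leverage_score_sample(A, m, rng)    # 200 rescaled sampled rows
G, G_hat = A.T @ A, S.T @ S             # exact vs. sampled Gram matrix
err = np.linalg.norm(G_hat - G, 2) / np.linalg.norm(G, 2)
```

With enough samples, `S.T @ S` concentrates around the true Gram matrix `A.T @ A`; the paper's contribution is obtaining such a sample with nearly optimal m, without ever forming A or its leverage scores explicitly.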
