Near Optimal Sketching of Low-Rank Tensor Regression

We study the least squares regression problem \begin{align*} \min_{\Theta \in \mathcal{S}_{\odot D,R}} \|A\Theta-b\|_2, \end{align*} where $\mathcal{S}_{\odot D,R}$ is the set of $\Theta$ for which $\Theta = \sum_{r=1}^{R} \theta_1^{(r)} \circ \cdots \circ \theta_D^{(r)}$ for vectors $\theta_d^{(r)} \in \mathbb{R}^{p_d}$ for all $r \in [R]$ and $d \in [D]$, and $\circ$ denotes the outer product of vectors. That is, $\Theta$ is a low-dimensional, low-rank tensor. This is motivated by the fact that the number of parameters in $\Theta$ is only $R \cdot \sum_{d=1}^{D} p_d$, which is significantly smaller than the $\prod_{d=1}^{D} p_d$ number of parameters in ordinary least squares regression. We consider the above CP decomposition model of tensors $\Theta$, as well as the Tucker decomposition. For both models, we show how to apply data dimensionality reduction techniques based on {\it sparse} random projections $\Phi \in \mathbb{R}^{m \times n}$, with $m \ll n$, to reduce the problem to a much smaller problem $\min_{\Theta} \|\Phi A \Theta - \Phi b\|_2$, for which if $\Theta'$ is a near-optimum to the smaller problem, then it is also a near-optimum to the original problem. We obtain a significantly smaller dimension and sparsity in $\Phi$ than is possible for ordinary least squares regression, and we also provide a number of numerical simulations supporting our theory.
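As a concrete illustration of the sketch-and-solve pattern described above, here is a minimal Python sketch for the $D=2$ CP case. It applies a CountSketch-style sparse projection $\Phi$ (one nonzero per column, applied implicitly in $O(\mathrm{nnz}(A))$ time) to $(A, b)$, then fits the rank-$R$ model on the sketched problem. The alternating least squares routine is a generic heuristic chosen here for illustration, not necessarily the paper's solver, and all sizes ($n$, $p_1$, $p_2$, $R$, $m$) are arbitrary placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes; the paper's guarantees relate m to R, D, and the p_d.
n, p1, p2, R = 5000, 20, 20, 3
m = 400  # sketch dimension, m << n

# Synthetic rank-R ground truth Theta = U* V*^T and data (A, b).
A = rng.standard_normal((n, p1 * p2))
U_true = rng.standard_normal((p1, R))
V_true = rng.standard_normal((p2, R))
b = A @ (U_true @ V_true.T).ravel() + 0.01 * rng.standard_normal(n)

# Sparse projection Phi as a CountSketch: each of the n rows is hashed
# to one of m buckets with a random sign, so Phi has one nonzero per column.
h = rng.integers(0, m, size=n)          # bucket index per row
s = rng.choice([-1.0, 1.0], size=n)     # random sign per row

def countsketch(M):
    """Apply Phi to an array with leading dimension n, without forming Phi."""
    SM = np.zeros((m,) + M.shape[1:])
    np.add.at(SM, h, s.reshape(-1, *([1] * (M.ndim - 1))) * M)
    return SM

SA, Sb = countsketch(A), countsketch(b)

def als(A_mat, b_vec, iters=30):
    """Generic alternating least squares for Theta = U V^T (D = 2 CP model)."""
    A3 = A_mat.reshape(-1, p1, p2)
    U = rng.standard_normal((p1, R))
    V = rng.standard_normal((p2, R))
    for _ in range(iters):
        # Fix V: the objective is linear in U, so solve a small lstsq for U.
        G = np.einsum('nij,jr->nir', A3, V).reshape(-1, p1 * R)
        U = np.linalg.lstsq(G, b_vec, rcond=None)[0].reshape(p1, R)
        # Fix U: symmetrically solve for V.
        G = np.einsum('nij,ir->njr', A3, U).reshape(-1, p2 * R)
        V = np.linalg.lstsq(G, b_vec, rcond=None)[0].reshape(p2, R)
    return (U @ V.T).ravel()

theta_sketch = als(SA, Sb)  # solve the m x (p1*p2) sketched problem
theta_full = als(A, b)      # solve the original n x (p1*p2) problem

# A near-optimum of the sketched problem should be a near-optimum
# of the original problem: compare residuals on the *original* data.
print("residual (sketched solve):", np.linalg.norm(A @ theta_sketch - b))
print("residual (full solve):    ", np.linalg.norm(A @ theta_full - b))
```

In this sketch the least squares solves run on $m$ rows instead of $n$, which is the source of the speedup; the final residuals are evaluated against the original $(A, b)$ to check the near-optimality claim empirically.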