
Sparse random tensors: concentration, regularization and applications

Abstract

We prove a non-asymptotic concentration inequality for sparse inhomogeneous random tensors under the spectral norm. For an order-$k$ inhomogeneous random tensor $T$ with sparsity $p_{\max}\geq \frac{c\log n}{n}$, we show that $\|T-\mathbb{E} T\|=O(\sqrt{n p_{\max}}\log^{k-2}(n))$ with high probability. The optimality of this bound up to polylog factors is established by an information-theoretic lower bound. By tensor matricization, we extend the range of sparsity to $p_{\max}\geq \frac{c\log n}{n^{k-1}}$ and obtain $\|T-\mathbb{E} T\|=O(\sqrt{n^{k-1} p_{\max}})$ with high probability. We also provide a simple way to regularize $T$ such that the $O(\sqrt{n^{k-1}p_{\max}})$ concentration bound still holds down to sparsity $p_{\max}\geq \frac{c}{n^{k-1}}$. We present our concentration and regularization results with two applications: (i) a randomized construction of hypergraphs of bounded degree with good expander mixing properties; (ii) concentration of sparsified tensors under uniform sampling.
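As a rough numerical illustration of the matricization idea mentioned above, the sketch below builds a homogeneous order-3 Bernoulli tensor (a toy stand-in for the paper's inhomogeneous model; the dimension $n$, seed, and probability $p_{\max}$ are arbitrary choices, not from the paper), flattens the centered tensor into an $n \times n^{2}$ matrix, and compares the matrix operator norm, which upper-bounds the tensor spectral norm, against the $\sqrt{n^{k-1} p_{\max}}$ scale:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p_max = 30, 0.2  # illustrative values, not from the paper

# Order-3 homogeneous Bernoulli tensor: each entry is 1 w.p. p_max.
T = (rng.random((n, n, n)) < p_max).astype(float)
centered = T - p_max  # T - E[T] in this homogeneous toy model

# Mode-1 matricization: reshape the order-3 tensor into an n x n^2 matrix.
M = centered.reshape(n, n * n)

# The matrix operator norm upper-bounds the tensor spectral norm of
# `centered` (computing the tensor spectral norm directly is NP-hard).
spec = np.linalg.norm(M, ord=2)

# The sqrt(n^{k-1} p_max) scale from the abstract, with k = 3.
bound = np.sqrt(n ** (3 - 1) * p_max)
print(f"matricized operator norm: {spec:.2f}, sqrt(n^2 p_max): {bound:.2f}")
```

For these parameters the two quantities are of comparable size, consistent with the $O(\sqrt{n^{k-1} p_{\max}})$ concentration rate (the $O(\cdot)$ constant is not specified here).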
