Near-optimal sample complexity for convex tensor completion

We analyze low-rank tensor completion (TC) using noisy measurements of a subset of the tensor. Assuming a rank-$r$, order-$d$, $N \times N \times \cdots \times N$ tensor where $r = O(1)$, the best sample complexity achieved so far is $O(N^{d/2})$, which is obtained by solving a tensor nuclear-norm minimization problem. However, this bound is significantly larger than the number of free variables in a low-rank tensor, which is $O(dNr)$. In this paper, we show that by using an atomic-norm whose atoms are rank-one sign tensors, one can obtain a sample complexity of $O(dN)$. Moreover, we generalize the matrix max-norm definition to tensors, which results in a max-quasi-norm (max-qnorm) whose unit ball has small Rademacher complexity. We prove that solving a constrained least squares estimation problem using either the convex atomic-norm or the nonconvex max-qnorm results in optimal sample complexity for low-rank tensor completion. Furthermore, we show that these bounds are nearly minimax rate-optimal. We also provide promising numerical results for max-qnorm constrained tensor completion, showing improved recovery compared to matricization and alternating least squares.
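As a rough sketch of the estimation framework described above (the notation here, $T^\sharp$ for the ground-truth tensor, $\Omega$ for the set of observed indices, $Y_\omega$ for the noisy measurements, and $R$ for the constraint radius, is assumed for illustration and not fixed by the abstract), the max-qnorm constrained least squares estimator can be written as
\[
  \widehat{T} \;=\; \operatorname*{arg\,min}_{\|T\|_{\max} \,\le\, R} \;\sum_{\omega \in \Omega} \bigl( T_\omega - Y_\omega \bigr)^2 ,
\]
with the convex variant obtained by replacing the max-qnorm ball with an atomic-norm ball whose atoms are rank-one sign tensors, i.e., tensors of the form $u^{(1)} \circ \cdots \circ u^{(d)}$ with $u^{(k)} \in \{\pm 1\}^N$.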