Adaptive Reduced Rank Regression

We study the low rank regression problem $y = Mx + \epsilon$, where $x$ and $y$ are $d_1$ and $d_2$ dimensional vectors respectively. We consider the extreme high-dimensional setting where the number of observations $n$ is less than $d_1 + d_2$. Existing algorithms are designed for settings where $n$ is typically as large as $\mathrm{rank}(M)(d_1 + d_2)$. This work provides an efficient algorithm that involves only two SVDs and establishes statistical guarantees on its performance. The algorithm decouples the problem by first estimating the precision matrix of the features and then solving a matrix denoising problem. To complement the upper bound, we introduce new techniques for establishing lower bounds on the performance of any algorithm for this problem. Our preliminary experiments confirm that our algorithm often outperforms existing baselines and is always at least competitive.
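A minimal sketch of the two-step pipeline described above, assuming a simple pseudo-inverse of the sample covariance as the precision-matrix estimate and a fixed-rank SVD truncation as the denoising step; the function name `adaptive_rrr_sketch` and these estimator choices are illustrative placeholders, not the paper's actual procedure or tuning.

```python
import numpy as np

def adaptive_rrr_sketch(X, Y, rank):
    """Illustrative two-step reduced rank regression pipeline:
    (1) estimate the precision matrix of the features,
    (2) denoise the resulting coefficient matrix via a truncated SVD.
    X: (n, d1) feature matrix, Y: (n, d2) response matrix.
    """
    n, d1 = X.shape

    # Step 1: estimate the feature precision matrix
    # (placeholder: pseudo-inverse of the sample covariance).
    cov = X.T @ X / n
    precision = np.linalg.pinv(cov)

    # Least-squares coefficient matrix built from the precision estimate.
    M_ls = precision @ (X.T @ Y / n)          # shape (d1, d2)

    # Step 2: matrix denoising by truncating the SVD to the target rank
    # (the paper's adaptive rank selection is not reproduced here).
    U, s, Vt = np.linalg.svd(M_ls, full_matrices=False)
    s[rank:] = 0.0
    return U @ np.diag(s) @ Vt                # low-rank estimate of M
```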