Adaptive Reduced Rank Regression

Abstract

We study the low rank regression problem $\mathbf{y} = M\mathbf{x} + \epsilon$, where $\mathbf{x}$ and $\mathbf{y}$ are $d_1$- and $d_2$-dimensional vectors, respectively. We consider the extreme high-dimensional setting where the number of observations $n$ is less than $d_1 + d_2$. Existing algorithms are designed for settings where $n$ is typically as large as $\mathrm{Rank}(M)(d_1 + d_2)$. This work provides an efficient algorithm that involves only two SVDs, and establishes statistical guarantees on its performance. The algorithm decouples the problem by first estimating the precision matrix of the features and then solving a matrix denoising problem. To complement the upper bound, we introduce new techniques for establishing lower bounds on the performance of any algorithm for this problem. Our preliminary experiments confirm that our algorithm often outperforms existing baselines and is always at least competitive.
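The two-step structure described above (feature-precision estimation followed by matrix denoising, each via an SVD) can be illustrated with a minimal sketch. The code below is an assumption-laden illustration, not the paper's estimator: it uses a pseudo-inverse of the sample covariance as the precision-matrix plug-in and hard rank truncation as the denoising step, and the function name `adaptive_rrr_sketch` and the fixed `rank` argument are hypothetical (the paper's method is adaptive and may select the rank differently).

```python
import numpy as np

def adaptive_rrr_sketch(X, Y, rank):
    """Illustrative two-SVD reduced rank regression (not the paper's exact estimator).

    X: (n, d1) feature matrix, Y: (n, d2) response matrix, rank: target rank.
    Returns an estimate of M with shape (d2, d1) in the model y = M x + noise.
    """
    # Step 1: SVD of the design matrix yields a pseudo-inverse of the sample
    # covariance, i.e. a plug-in estimate of the feature precision matrix.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s_inv = np.where(s > 1e-10, 1.0 / s, 0.0)
    # Least-squares-style coefficient matrix, shape (d2, d1).
    M_ls = Y.T @ U @ np.diag(s_inv) @ Vt
    # Step 2: a second SVD denoises M_ls by keeping the top `rank` components.
    Um, sm, Vmt = np.linalg.svd(M_ls, full_matrices=False)
    return (Um[:, :rank] * sm[:rank]) @ Vmt[:rank, :]

if __name__ == "__main__":
    # Synthetic check in the n < d1 + d2 regime with a rank-2 ground truth.
    rng = np.random.default_rng(0)
    n, d1, d2, r = 80, 60, 60, 2
    M_true = rng.normal(size=(d2, r)) @ rng.normal(size=(r, d1))
    X = rng.normal(size=(n, d1))
    Y = X @ M_true.T + 0.1 * rng.normal(size=(n, d2))
    M_hat = adaptive_rrr_sketch(X, Y, rank=r)
    print("relative error:", np.linalg.norm(M_hat - M_true) / np.linalg.norm(M_true))
```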
