We consider a parsimonious model for fitting observation data $X$ with two-way dependencies; that is, we use the signal matrix $X_0$ to explain the column-wise dependency in $X$, and the measurement error matrix $W$ to explain its row-wise dependency. In the matrix normal setting, we have the representation $X = X_0 + W$, where $X$ follows the matrix variate normal distribution with the Kronecker sum covariance structure $\mathrm{Cov}(\mathrm{vec}(X)) = A \oplus B = A \otimes I_n + I_m \otimes B$, where $A \in \mathbb{R}^{m \times m}$ and $B \in \mathbb{R}^{n \times n}$; this is generalized to the subgaussian setting as follows. Suppose that we observe $y \in \mathbb{R}^n$ and $X \in \mathbb{R}^{n \times m}$ in the following model: \begin{eqnarray*} y & = & X_0 \beta^* + \epsilon \\ X & = & X_0 + W \end{eqnarray*} where $X_0$ is an $n \times m$ design matrix with independent subgaussian row vectors, $\epsilon \in \mathbb{R}^n$ is a noise vector, and $W$ is a mean zero $n \times m$ random noise matrix with independent subgaussian column vectors, independent of $X_0$ and $\epsilon$. This model differs significantly from those analyzed in the literature. Under sparsity and restricted eigenvalue type conditions, we show that one can recover a sparse vector $\beta^* \in \mathbb{R}^m$ from this model given a single observation matrix $X$ and the response vector $y$. We establish consistency in estimating $\beta^*$ and obtain rates of convergence in the $\ell_q$ norm, where $q = 1, 2$ for the Lasso-type estimator, and $q \in [1, 2]$ for a Dantzig-type conic programming estimator.
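As a minimal illustration, the observation model above can be simulated as follows. This is a sketch only: the dimensions, noise scales, and sparsity level are assumptions chosen for the example, not values from the abstract, and Gaussian draws stand in for the more general subgaussian distributions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumed, not from the abstract):
# n samples, m features, s nonzero coefficients in beta*.
n, m, s = 200, 500, 5

# Sparse target vector beta* with s nonzero entries.
beta_star = np.zeros(m)
support = rng.choice(m, size=s, replace=False)
beta_star[support] = rng.normal(size=s)

# X_0: n x m design matrix with independent subgaussian
# (here standard Gaussian) row vectors.
X0 = rng.normal(size=(n, m))

# W: mean zero n x m random noise matrix with independent
# subgaussian column vectors, independent of X_0 and epsilon.
W = 0.5 * rng.normal(size=(n, m))

# epsilon: noise vector in R^n.
eps = 0.1 * rng.normal(size=n)

# The model: the response is generated from the latent design X_0,
# but only the corrupted matrix X = X_0 + W is observed.
y = X0 @ beta_star + eps
X = X0 + W
```

The estimation problem the abstract addresses is to recover the sparse `beta_star` from the single corrupted pair `(X, y)`, without access to `X0`.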