Error Bound of Empirical Risk Minimization for Noisy Standard and Generalized Phase Retrieval Problems

In this paper, we study the estimation performance of empirical risk minimization (ERM) in noisy (standard) phase retrieval (NPR), given by the measurements $y_k = |\langle \alpha_k, x_0\rangle|^2 + \eta_k$, and in noisy generalized phase retrieval (NGPR), formulated as $y_k = x_0^* A_k x_0 + \eta_k$ for $k = 1, \ldots, n$, where $x_0$ is the desired signal, $n$ is the sample size, and $\eta = (\eta_1, \ldots, \eta_n)^\top$ is the noise vector. We establish new error bounds under different noise patterns, and our proofs are valid in both the real and complex cases. In NPR under an arbitrary noise vector $\eta$, we derive a new error bound that is tighter than the currently known one in many cases; in NGPR, we establish an analogous bound for arbitrary $\eta$. In both problems, the bounds for arbitrary noise immediately yield corresponding rates for sub-Gaussian or sub-exponential random noise, with some conventional but inessential assumptions (e.g., independence or zero-mean conditions) removed or weakened. In addition, we make a first attempt at ERM under heavy-tailed random noise, which is only assumed to have a bounded moment of some finite order. To achieve a trade-off between bias and variance, we truncate the responses and propose a corresponding robust ERM estimator, which is shown to admit an error guarantee in both NPR and NGPR. All the error bounds extend straightforwardly to the more general problem of rank-$r$ matrix recovery, and these results lead to the conclusion that the full-rank frame in NGPR is more robust to biased noise than the rank-1 frame in NPR. Extensive experimental results are presented to illustrate our theoretical findings.
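To make the truncation idea concrete, the following is a minimal illustrative sketch (not the paper's estimator) for the real-valued NPR model: responses are clipped at an assumed level `tau` to limit the influence of heavy-tailed noise, and the empirical $\ell_2$ risk over the truncated responses is then minimized by gradient descent from a spectral initialization. All parameter choices (`tau`, `step`, the Student-t noise in the usage below) are assumptions made for this example only.

```python
import numpy as np

def truncated_erm_npr(A, y, tau, step=0.02, iters=500):
    """Illustrative robust ERM sketch for real-valued noisy phase retrieval.

    A   : (n, d) array of sensing vectors a_k (rows)
    y   : responses y_k = <a_k, x0>^2 + eta_k, possibly heavy-tailed noise
    tau : truncation level; clipping trades a small bias for reduced variance
    """
    n, d = A.shape
    y_t = np.clip(y, -tau, tau)                       # truncate the responses
    # spectral initialization: leading eigenvector of (1/n) * sum y_t * a a^T,
    # rescaled so that ||x|| ~ sqrt(mean(y_t)) ~ ||x0||
    M = (A * y_t[:, None]).T @ A / n
    _, V = np.linalg.eigh(M)
    x = V[:, -1] * np.sqrt(max(np.mean(y_t), 0.0))
    # gradient descent on the empirical l2 risk (1/n) * sum (<a,x>^2 - y_t)^2;
    # the gradient is (4/n) * sum (<a,x>^2 - y_t) <a,x> a
    for _ in range(iters):
        Ax = A @ x
        x = x - step * (4.0 / n) * (A.T @ ((Ax**2 - y_t) * Ax))
    return x
```

Since the responses determine $x_0$ only up to a global sign (or phase), a recovery error should be measured as $\min(\|\hat{x} - x_0\|, \|\hat{x} + x_0\|)$.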