Gaussian Approximation of Quantization Error for Estimation from Compressed Data

We consider the distributional connection between the lossy compressed representation of a high-dimensional signal $X^n$ using a random spherical code and the observation of $X^n$ under additive white Gaussian noise (AWGN). We show that the Wasserstein distance between a bitrate-$R$ compressed version of $X^n$ and its observation under an AWGN channel of signal-to-noise ratio $2^{2R}-1$ is sub-linear in the problem dimension $n$. We use this fact to relate the risk of an estimator based on an AWGN-corrupted version of $X^n$ to the risk attained by the same estimator when fed with its bitrate-$R$ quantized version. We demonstrate the usefulness of this connection by deriving various novel results for inference problems under compression constraints, including minimax estimation, sparse regression, compressed sensing, and the universality of linear estimation in remote source coding.
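As background on where the signal-to-noise ratio above comes from (a standard rate–capacity identity, not a statement taken from the paper's proofs; the symbol $\gamma$ is introduced here only for illustration): an AWGN channel supports rate $R$ bits per sample precisely when its SNR $\gamma$ satisfies

$$ R \;=\; \tfrac{1}{2}\log_2\!\left(1+\gamma\right) \qquad\Longleftrightarrow\qquad \gamma \;=\; 2^{2R}-1. $$

For example, a bitrate of $R = 1$ bit per sample corresponds to $\gamma = 2^{2}-1 = 3$, i.e. roughly $4.8$ dB.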