ResearchTrend.AI
arXiv:2006.12415
The Generalized Lasso with Nonlinear Observations and Generative Priors

22 June 2020
Zhaoqiang Liu
Jonathan Scarlett
Abstract

In this paper, we study the problem of signal estimation from noisy non-linear measurements when the unknown $n$-dimensional signal is in the range of an $L$-Lipschitz continuous generative model with bounded $k$-dimensional inputs. We make the assumption of sub-Gaussian measurements, which is satisfied by a wide range of measurement models, such as linear, logistic, 1-bit, and other quantized models. In addition, we consider the impact of adversarial corruptions on these measurements. Our analysis is based on a generalized Lasso approach (Plan and Vershynin, 2016). We first provide a non-uniform recovery guarantee, which states that under i.i.d. Gaussian measurements, roughly $O\left(\frac{k}{\epsilon^2}\log L\right)$ samples suffice for recovery with an $\ell_2$-error of $\epsilon$, and that this scheme is robust to adversarial noise. Then, we apply this result to neural network generative models, and discuss various extensions to other models and non-i.i.d. measurements. Moreover, we show that our result can be extended to the uniform recovery guarantee under the assumption of a so-called local embedding property, which is satisfied by the 1-bit and censored Tobit models.
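To make the setup concrete, here is a minimal sketch (not the authors' code) of the generalized Lasso estimator in the style of Plan and Vershynin: given 1-bit observations $y = \mathrm{sign}(Ax)$ of a signal $x = G(z)$ in the range of a generative model, one minimizes $\|y - AG(z)\|_2$ over the latent code $z$. For simplicity the sketch uses a *linear* toy generator $G(z) = Wz$ with orthonormal columns (Lipschitz constant $L = 1$), so the minimization reduces to ordinary least squares; for a ReLU-network generator as in the paper, one would use gradient-based optimization over $z$ instead. All dimensions and the generator here are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
k, n, m = 5, 20, 2000  # latent dim, signal dim, number of measurements

# Toy generative model: linear map with orthonormal columns (L = 1).
# The paper's guarantees cover general L-Lipschitz models, e.g. ReLU nets.
W, _ = np.linalg.qr(rng.standard_normal((n, k)))

def G(z):
    return W @ z

z_true = rng.standard_normal(k)
x_true = G(z_true)
x_true /= np.linalg.norm(x_true)      # unit-norm target signal

A = rng.standard_normal((m, n))       # i.i.d. Gaussian measurement matrix
y = np.sign(A @ x_true)               # 1-bit (sub-Gaussian) observations

# Generalized Lasso: minimize ||y - A G(z)||_2 over the latent code z.
# With a linear G this is ordinary least squares in z.
B = A @ W
z_hat, *_ = np.linalg.lstsq(B, y, rcond=None)
x_hat = G(z_hat)

# 1-bit measurements destroy the signal's scale, so recovery is only up
# to a positive scalar; compare directions via cosine similarity.
cos = (x_hat @ x_true) / (np.linalg.norm(x_hat) * np.linalg.norm(x_true))
```

Despite the non-linearity of the sign function, the linear least-squares fit recovers the direction of $x$; the estimate's norm concentrates around $\sqrt{2/\pi} \approx 0.8$, the scaling factor $\mu = \mathbb{E}[\mathrm{sign}(g)\,g]$ for a standard Gaussian $g$, which is exactly the behavior the generalized Lasso analysis predicts for sub-Gaussian observation models.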
