

Removing Structured Noise with Diffusion Models

20 January 2023
Tristan S. W. Stevens, Hans van Gorp, F. C. Meral, Junseob Shin, Jason Yu, Jean-Luc Robert, Ruud J. G. van Sloun
Abstract

Solving ill-posed inverse problems requires careful formulation of prior beliefs over the signals of interest and an accurate description of their manifestation into noisy measurements. Handcrafted signal priors based on, e.g., sparsity are increasingly replaced by data-driven deep generative models, and several groups have recently shown that state-of-the-art score-based diffusion models yield particularly strong performance and flexibility. In this paper, we show that the powerful paradigm of posterior sampling with diffusion models can be extended to include rich, structured noise models. To that end, we propose a joint conditional reverse diffusion process with learned scores for the noise and signal-generating distributions. We demonstrate strong performance gains across various inverse problems with structured noise, outperforming competitive baselines that use normalizing flows and adversarial networks. This opens up new opportunities and relevant practical applications of diffusion modeling for inverse problems in the context of non-Gaussian measurement models.
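As a rough illustration of the joint posterior sampling idea described in the abstract, the sketch below assumes a linear measurement model y = A x + n with additive structured noise n, two pretrained score networks for the signal and noise priors (stood in for here by untrained placeholders), and a simple annealed Langevin sampler with a shared likelihood-guidance term. The operator A, the guidance weight, the noise schedule, and all function names are illustrative assumptions, not details taken from the paper.

import torch

class DummyScore(torch.nn.Module):
    """Placeholder for a learned score network s_theta(z, sigma) ~ grad log p_sigma(z)."""
    def __init__(self, dim):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(dim + 1, 128),
            torch.nn.SiLU(),
            torch.nn.Linear(128, dim),
        )

    def forward(self, z, sigma):
        # Condition on the noise level by appending it as an extra feature.
        sigma_feat = sigma.expand(z.shape[0], 1)
        return self.net(torch.cat([z, sigma_feat], dim=-1))

def joint_posterior_sample(y, A, score_x, score_n, sigmas, steps=10, guidance=1.0):
    """Annealed Langevin-style joint sampling of signal x and structured noise n
    given measurements y = A x + n (an illustrative sketch, not the authors' code)."""
    x = torch.randn(y.shape[0], A.shape[1])
    n = torch.randn_like(y)
    for sigma in sigmas:                      # coarse-to-fine noise schedule
        step = 0.5 * sigma ** 2
        sig = torch.full((1, 1), sigma)
        for _ in range(steps):
            # Prior scores from the two learned models (signal and noise).
            grad_x = score_x(x, sig)
            grad_n = score_n(n, sig)
            # Shared data-consistency gradient of -||y - A x - n||^2 / (2 sigma^2),
            # so signal and noise jointly explain the measurement.
            residual = y - x @ A.T - n
            grad_x = grad_x + guidance * residual @ A / sigma ** 2
            grad_n = grad_n + guidance * residual / sigma ** 2
            x = x + step * grad_x + (2 * step) ** 0.5 * torch.randn_like(x)
            n = n + step * grad_n + (2 * step) ** 0.5 * torch.randn_like(n)
    return x, n

# Toy usage with a random linear operator and untrained placeholder scores.
m, d = 32, 64
A = torch.randn(m, d) / d ** 0.5
score_x, score_n = DummyScore(d), DummyScore(m)
y = torch.randn(4, m)
x_hat, n_hat = joint_posterior_sample(y, A, score_x, score_n, sigmas=[1.0, 0.5, 0.1, 0.01])

The key point the sketch tries to convey is that both chains share the same measurement residual: the signal and the structured noise are denoised jointly so that together they account for the observation, rather than treating the noise as an unstructured Gaussian term.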

@article{stevens2025_2302.05290,
  title={Removing Structured Noise with Diffusion Models},
  author={Tristan S. W. Stevens and Hans van Gorp and Faik C. Meral and Junseob Shin and Jason Yu and Jean-Luc Robert and Ruud J. G. van Sloun},
  journal={arXiv preprint arXiv:2302.05290},
  year={2025}
}