Posterior consistency and convergence rates for Bayesian inversion with hypoelliptic operators

7 July 2015
Hanne Kekkonen
Matti Lassas
S. Siltanen
Abstract

The Bayesian approach to inverse problems is studied in the case where the forward map is a linear hypoelliptic pseudodifferential operator and the measurement error is additive white Gaussian noise. The measurement model for an unknown Gaussian random variable $U(x,\omega)$ is \begin{eqnarray*} M(y,\omega) = A(U(x,\omega)) + \delta\hspace{.2mm}\mathcal{E}(y,\omega), \end{eqnarray*} where $A$ is a finitely many times smoothing linear hypoelliptic operator and $\delta>0$ is the noise magnitude. The covariance operator $C_U$ of $U$ is a $2r$ times smoothing, self-adjoint, injective, elliptic pseudodifferential operator. If $\mathcal{E}$ took values in $L^2$, then in the Gaussian case computing the conditional mean (and maximum a posteriori) estimate would be linked to solving the minimisation problem \begin{eqnarray*} T_\delta(M) = \text{argmin}_{u\in H^r} \big\{\|A u-m\|_{L^2}^2+ \delta^2\|C_U^{-1/2}u\|_{L^2}^2 \big\}. \end{eqnarray*} However, Gaussian white noise does not take values in $L^2$ but in $H^{-s}$, where $s>0$ is large enough. A modification of the above approach to solve the inverse problem is presented, covering the case of white Gaussian measurement noise. Furthermore, the convergence of the conditional mean estimate to the correct solution as $\delta\rightarrow 0$ is proven in appropriate function spaces using microlocal analysis. The contraction of the confidence regions is also studied.
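As a rough illustration of the minimisation problem above (and not of the paper's infinite-dimensional construction), the sketch below computes a finite-dimensional analogue of $T_\delta(M)$ via the normal equations $(A^\top A + \delta^2 C_U^{-1})u = A^\top m$. The grid, the smoothing kernel standing in for $A$, and the prior covariance $C_U$ are illustrative assumptions only.

import numpy as np

# Finite-dimensional sketch of the Tikhonov-type estimate
#   T_delta(m) = argmin_u ||A u - m||^2 + delta^2 ||C_U^{-1/2} u||^2,
# whose closed-form minimiser is (A^T A + delta^2 C_U^{-1})^{-1} A^T m.
# The discretisation below is an assumption for illustration, not the
# operators used in the paper.

rng = np.random.default_rng(0)
n = 200
x = np.linspace(0.0, 1.0, n)

# "Smoothing" forward operator: discrete convolution with a Gaussian kernel.
width = 0.03
A = np.exp(-0.5 * ((x[:, None] - x[None, :]) / width) ** 2)
A /= A.sum(axis=1, keepdims=True)

# Prior covariance C_U: exponential (Ornstein-Uhlenbeck) covariance matrix.
ell = 0.1
C_U = np.exp(-np.abs(x[:, None] - x[None, :]) / ell) + 1e-8 * np.eye(n)

# Ground truth drawn from the prior; data with discrete white noise of size delta.
u_true = np.linalg.cholesky(C_U) @ rng.standard_normal(n)
delta = 0.01
m = A @ u_true + delta * rng.standard_normal(n)

# Conditional mean / MAP estimate via the normal equations.
C_U_inv = np.linalg.inv(C_U)
T_delta = np.linalg.solve(A.T @ A + delta**2 * C_U_inv, A.T @ m)

print("relative L2 error:", np.linalg.norm(T_delta - u_true) / np.linalg.norm(u_true))

Re-running the sketch with smaller values of delta shows the estimate approaching the ground truth, which is the finite-dimensional counterpart of the convergence studied in the paper.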
