arXiv: 1906.03486 (v4, latest)

On statistical Calderón problems

8 June 2019
Kweku Abraham
Richard Nickl
Abstract

For $D$ a bounded domain in $\mathbb{R}^d$, $d \ge 2$, with smooth boundary $\partial D$, the non-linear inverse problem of recovering the unknown conductivity $\gamma$ determining solutions $u = u_{\gamma,f}$ of the partial differential equation
\begin{equation*}
\begin{split}
\nabla \cdot (\gamma \nabla u) &= 0 \quad \text{in } D, \\
u &= f \quad \text{on } \partial D,
\end{split}
\end{equation*}
from noisy observations $Y$ of the Dirichlet-to-Neumann map
\[ f \mapsto \Lambda_\gamma(f) = \gamma \frac{\partial u_{\gamma,f}}{\partial \nu}\Big|_{\partial D}, \]
with $\partial/\partial\nu$ denoting the outward normal derivative, is considered. The data $Y$ consist of $\Lambda_\gamma$ corrupted by additive Gaussian noise at noise level $\varepsilon > 0$, and a statistical algorithm $\hat\gamma(Y)$ is constructed which is shown to recover $\gamma$ in supremum-norm loss at a statistical convergence rate of the order $\log(1/\varepsilon)^{-\delta}$ as $\varepsilon \to 0$. It is further shown that this convergence rate is optimal, up to the precise value of the exponent $\delta > 0$, in an information-theoretic sense. The estimator $\hat\gamma(Y)$ has a Bayesian interpretation as the posterior mean of a suitable Gaussian process prior and can be computed by MCMC methods.
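To illustrate the Bayesian posterior-mean idea in the abstract, here is a minimal sketch in a *finite-dimensional, linearised* surrogate of the observation model: we pretend the forward map is a known matrix $A$ (standing in for a linearisation of $\gamma \mapsto \Lambda_\gamma$) and observe $Y = A\theta + \varepsilon\xi$ with Gaussian noise. Under a Gaussian prior on $\theta$ the posterior mean is then available in closed form, so no MCMC is needed in this toy setting. All variable names and dimensions below are illustrative assumptions, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-Gaussian analogue of Y = Lambda_gamma + eps * (white noise):
# observe Y = A @ theta + eps * xi, with A a surrogate (linearised) forward map
# and theta a discretised stand-in for the conductivity gamma.
n, p = 50, 10          # number of observations, parameter dimension (assumed)
eps = 0.1              # noise level epsilon
A = rng.normal(size=(n, p)) / np.sqrt(n)
theta_true = rng.normal(size=p)
Y = A @ theta_true + eps * rng.normal(size=n)

# Gaussian prior theta ~ N(0, Sigma); the conjugate posterior mean is
#   E[theta | Y] = Sigma A^T (A Sigma A^T + eps^2 I)^{-1} Y.
Sigma = np.eye(p)
posterior_mean = Sigma @ A.T @ np.linalg.solve(
    A @ Sigma @ A.T + eps**2 * np.eye(n), Y
)

# Supremum-norm estimation error, the loss considered in the paper.
sup_error = np.max(np.abs(posterior_mean - theta_true))
print(sup_error)
```

In the actual non-linear Calderón problem no such closed form exists, which is why the paper's estimator is characterised as a posterior mean and computed by MCMC; the sketch only shows the conjugate-Gaussian mechanism underlying that interpretation.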
