Learning elliptic partial differential equations with randomized linear algebra

31 January 2021
Nicolas Boullé
Alex Townsend
arXiv:2102.00491 · PDF · HTML
Abstract

Given input-output pairs of an elliptic partial differential equation (PDE) in three dimensions, we derive the first theoretically rigorous scheme for learning the associated Green's function $G$. By exploiting the hierarchical low-rank structure of $G$, we show that one can construct an approximant to $G$ that converges almost surely and achieves a relative error of $\mathcal{O}(\Gamma_\epsilon^{-1/2}\log^3(1/\epsilon)\,\epsilon)$ using at most $\mathcal{O}(\epsilon^{-6}\log^4(1/\epsilon))$ input-output training pairs with high probability, for any $0<\epsilon<1$. The quantity $0<\Gamma_\epsilon\leq 1$ characterizes the quality of the training dataset. Along the way, we extend the randomized singular value decomposition algorithm for learning matrices to Hilbert--Schmidt operators and characterize the quality of covariance kernels for PDE learning.
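To make the learning procedure concrete, below is a minimal finite-dimensional sketch of a randomized SVD driven by input-output pairs: random forcing terms are drawn from a Gaussian process, the corresponding solutions serve as the training data, and a low-rank approximant of the (discretised) Green's function is assembled from them. The grid size, the squared-exponential covariance kernel, and the synthetic symmetric operator `A` are illustrative assumptions rather than the authors' setup, and the code uses the classical matrix randomized SVD, not the Hilbert–Schmidt operator extension developed in the paper.

```python
# Minimal finite-dimensional sketch of a randomized SVD applied to
# input-output pairs, standing in for the Hilbert-Schmidt operator
# version analysed in the paper.  The grid, kernel, and synthetic
# solution operator `A` are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

n = 200                       # grid points discretising the domain
x = np.linspace(0.0, 1.0, n)

# Covariance kernel of the Gaussian process used to draw random
# right-hand sides (squared-exponential, an assumed choice).
K = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * 0.05 ** 2))
L = np.linalg.cholesky(K + 1e-8 * np.eye(n))

# Synthetic symmetric "solution operator" standing in for the
# discretised Green's function G(x, y); in practice each column of
# `U_data` below would come from a PDE solve or an experiment.
A = np.exp(-np.abs(x[:, None] - x[None, :]))

k, p = 20, 5                                # target rank and oversampling
F = L @ rng.standard_normal((n, k + p))     # random input functions f_j
U_data = A @ F                              # observed outputs u_j = A f_j

# Orthonormal basis for the range of the operator seen through the data.
Q, _ = np.linalg.qr(U_data)

# Second round of queries against the adjoint; for a self-adjoint
# elliptic problem the Green's function is symmetric, so applying A
# to the columns of Q suffices: (A Q)^T = Q^T A.
B = (A @ Q).T

# Small SVD of the projected operator gives the low-rank approximant
# G_approx = Q Q^T A.
W, s, Vt = np.linalg.svd(B, full_matrices=False)
G_approx = (Q @ W) @ np.diag(s) @ Vt

rel_err = np.linalg.norm(G_approx - A) / np.linalg.norm(A)
print(f"relative error of rank-{k + p} approximant: {rel_err:.2e}")
```

The symmetry shortcut in the second stage is what lets the adjoint queries reduce to further applications of the same operator; for non-self-adjoint problems one would need access to the adjoint solution operator instead.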
