Solving parametric partial differential equations with deep rectified quadratic unit neural networks

14 March 2022
Zhen Lei
Lei Shi
Chenyu Zeng
Abstract

Implementing deep neural networks for learning the solution maps of parametric partial differential equations (PDEs) turns out to be more efficient than using many conventional numerical methods. However, limited theoretical analyses have been conducted on this approach. In this study, we investigate the expressive power of deep rectified quadratic unit (ReQU) neural networks for approximating the solution maps of parametric PDEs. The proposed approach is motivated by the recent important work of G. Kutyniok, P. Petersen, M. Raslan and R. Schneider (Gitta Kutyniok, Philipp Petersen, Mones Raslan, and Reinhold Schneider. A theoretical analysis of deep neural networks and parametric PDEs. Constructive Approximation, pages 1-53, 2021), which uses deep rectified linear unit (ReLU) neural networks for solving parametric PDEs. In contrast to the previously established complexity bound $\mathcal{O}\left(d^3\log_{2}^{q}(1/\epsilon)\right)$ for ReLU neural networks, we derive an upper bound $\mathcal{O}\left(d^3\log_{2}^{q}\log_{2}(1/\epsilon)\right)$ on the size of the deep ReQU neural network required to achieve accuracy $\epsilon>0$, where $d$ is the dimension of the reduced basis representing the solutions. Our method takes full advantage of the inherent low-dimensionality of the solution manifolds and the better approximation performance of deep ReQU neural networks. Numerical experiments are performed to verify our theoretical results.
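
For readers unfamiliar with the activation, the sketch below shows a small fully connected network with ReQU activations, i.e. $\sigma(x) = \max(0, x)^2$, mapping a PDE parameter vector to coefficients in a $d$-dimensional reduced basis. The layer widths, depth, and input/output dimensions here are illustrative assumptions for a minimal PyTorch example, not the specific construction analyzed in the paper.

```python
import torch
import torch.nn as nn

class ReQU(nn.Module):
    """Rectified quadratic unit: x -> max(0, x)^2."""
    def forward(self, x):
        return torch.relu(x) ** 2

class ReQUNet(nn.Module):
    """Illustrative ReQU network mapping a PDE parameter vector (dim p)
    to coefficients of the solution in a d-dimensional reduced basis."""
    def __init__(self, p, d, width=64, depth=3):
        super().__init__()
        layers, in_dim = [], p
        for _ in range(depth):
            layers += [nn.Linear(in_dim, width), ReQU()]
            in_dim = width
        layers.append(nn.Linear(in_dim, d))  # linear output layer
        self.net = nn.Sequential(*layers)

    def forward(self, y):
        return self.net(y)

# Hypothetical usage: 10-dimensional parameter, 20 reduced-basis coefficients.
model = ReQUNet(p=10, d=20)
coeffs = model(torch.randn(5, 10))  # batch of 5 parameter samples
print(coeffs.shape)                 # torch.Size([5, 20])
```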
