  3. 2305.16910
Universal approximation with complex-valued deep narrow neural networks

26 May 2023
Paul Geuchen
Thomas Jahn
Hannes Matt
Abstract

We study the universality of complex-valued neural networks with bounded widths and arbitrary depths. Under mild assumptions, we give a full description of those activation functions $\varrho:\mathbb{C}\to\mathbb{C}$ that have the property that their associated networks are universal, i.e., are capable of approximating continuous functions to arbitrary accuracy on compact domains. Precisely, we show that deep narrow complex-valued networks are universal if and only if their activation function is neither holomorphic, nor antiholomorphic, nor $\mathbb{R}$-affine. This is a much larger class of functions than in the dual setting of arbitrary width and fixed depth. Unlike in the real case, the sufficient width differs significantly depending on the considered activation function. We show that a width of $2n+2m+5$ is always sufficient and that in general a width of $\max\{2n,2m\}$ is necessary. We prove, however, that a width of $n+m+4$ suffices for a rich subclass of the admissible activation functions. Here, $n$ and $m$ denote the input and output dimensions of the considered networks.
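As an illustration of the objects studied in the abstract, the following is a minimal NumPy sketch (all sizes and names are hypothetical, not taken from the paper) of a deep narrow complex-valued network. The activation applies ReLU separately to the real and imaginary parts; such a map is neither holomorphic, nor antiholomorphic, nor $\mathbb{R}$-affine, hence admissible by the paper's criterion. The hidden width is set to the sufficiency bound $2n+2m+5$.

```python
import numpy as np

def varrho(z):
    # Separable activation: ReLU on real and imaginary parts.
    # Neither holomorphic, antiholomorphic, nor R-affine.
    return np.maximum(z.real, 0) + 1j * np.maximum(z.imag, 0)

def narrow_cvnn(x, weights, biases):
    """Forward pass of a deep narrow complex-valued network.

    x: complex input vector of dimension n; all hidden layers share
    the same (narrow) width, and the final layer is affine.
    """
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = varrho(W @ h + b)
    return weights[-1] @ h + biases[-1]

# Hypothetical sizes: n = 2 inputs, m = 1 output, depth 6 layers,
# hidden width 2n + 2m + 5 = 11 (the paper's general sufficiency bound).
rng = np.random.default_rng(0)
n, m, depth = 2, 1, 6
width = 2 * n + 2 * m + 5
dims = [n] + [width] * (depth - 1) + [m]
weights = [rng.standard_normal((dims[i + 1], dims[i]))
           + 1j * rng.standard_normal((dims[i + 1], dims[i]))
           for i in range(depth)]
biases = [rng.standard_normal(dims[i + 1])
          + 1j * rng.standard_normal(dims[i + 1])
          for i in range(depth)]

x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
y = narrow_cvnn(x, weights, biases)  # complex output of dimension m
```

Note that universality here is an existence statement about the class of such networks as the depth grows, not a claim about any particular random initialization.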
