
Deep Function Machines: Generalized Neural Networks for Topological Layer Expression

14 December 2016
William H. Guss
arXiv: 1612.04799
Abstract

In this paper we propose a generalization of deep neural networks called deep function machines (DFMs). DFMs act on vector spaces of arbitrary (possibly infinite) dimension, and we show that a family of DFMs is invariant to the dimension of the input data; that is, the parameterization of the model does not directly hinge on the resolution of the input (e.g., high-resolution images). Using this generalization, we provide a new theory of universal approximation of bounded non-linear operators between function spaces on locally compact Hausdorff spaces. We then suggest that DFMs provide an expressive framework for designing new neural network layer types with topological considerations in mind. Finally, we give several examples of DFMs and, in particular, a practical algorithm for neural networks approximating infinite-dimensional operators.
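To make the abstract's central idea concrete, the following is a minimal sketch (not the authors' reference implementation) of a layer whose weight is a continuous kernel w(s, t) rather than a matrix, applied by integration: (Tf)(s) = g(∫ w(s, t) f(t) dt + b(s)). The polynomial kernel parameterization, uniform grids, quadrature rule, and tanh activation below are illustrative assumptions, not details taken from the paper.

import numpy as np

class IntegralLayer:
    """A layer whose forward pass approximates an integral operator on [0, 1]."""

    def __init__(self, n_in, n_out, n_basis=4, rng=None):
        rng = np.random.default_rng() if rng is None else rng
        self.t = np.linspace(0.0, 1.0, n_in)    # input grid (samples of f)
        self.s = np.linspace(0.0, 1.0, n_out)   # output grid (samples of Tf)
        # Coefficients of w(s, t) = sum_{i,j} coef[i, j] * s**i * t**j
        self.coef = rng.normal(scale=0.1, size=(n_basis, n_basis))
        self.bias = np.zeros(n_out)

    def kernel(self):
        # Evaluate the continuous kernel w(s, t) on the s x t grid.
        S = np.stack([self.s ** i for i in range(self.coef.shape[0])])  # (n_basis, n_out)
        T = np.stack([self.t ** j for j in range(self.coef.shape[1])])  # (n_basis, n_in)
        return S.T @ self.coef @ T                                      # (n_out, n_in)

    def forward(self, f_vals):
        # Riemann-sum quadrature of the integral, then a pointwise nonlinearity.
        dt = self.t[1] - self.t[0]
        pre = self.kernel() @ f_vals * dt + self.bias
        return np.tanh(pre)

# Usage: apply the layer to samples of f(t) = sin(2*pi*t).
layer = IntegralLayer(n_in=64, n_out=16)
f_vals = np.sin(2 * np.pi * layer.t)
print(layer.forward(f_vals).shape)  # (16,)

Because the trainable parameters are the kernel coefficients rather than an n_in-by-n_out weight matrix, the same layer can be re-evaluated on a finer input grid without changing its parameter count, which is the kind of resolution invariance the abstract describes.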
