State Space Representations of Deep Neural Networks

11 June 2018
Michael Hauser
Sean Gunn
S. Saab
A. Ray
Abstract

This paper deals with neural networks as dynamical systems governed by differential or difference equations. It shows that the introduction of skip connections into network architectures, such as residual networks and dense networks, turns a system of static equations into a system of dynamical equations with varying levels of smoothness on the layer-wise transformations. Closed form solutions for the state space representations of general dense networks, as well as $k^{th}$ order smooth networks, are found in general settings. Furthermore, it is shown that imposing $k^{th}$ order smoothness on a network architecture with $d$-many nodes per layer increases the state space dimension by a multiple of $k$, and so the effective embedding dimension of the data manifold is $k \cdot d$-many dimensions. It follows that network architectures of these types reduce the number of parameters needed to maintain the same embedding dimension by a factor of $k^2$ when compared to an equivalent first-order, residual network, significantly motivating the development of network architectures of these types. Numerical simulations were run to validate parts of the developed theory.

View on arXiv: https://arxiv.org/abs/1806.03751
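The abstract's identification of skip connections with difference equations can be made concrete with a small numerical sketch. The snippet below is illustrative only and is not the authors' code: the tanh layer function, the forward-Euler form of the residual update, and the central-difference form of the second-order update are assumptions chosen to match the standard dynamical-systems reading of residual networks, not details taken from the paper.

```python
import numpy as np

def layer_fn(x, W, b):
    """A generic layer-wise transformation f(x) = tanh(Wx + b) (illustrative choice)."""
    return np.tanh(W @ x + b)

def residual_step(x_n, W, b):
    """First-order (residual) update: x_{n+1} = x_n + f(x_n).
    This is the forward-Euler discretization of dx/dt = f(x)."""
    return x_n + layer_fn(x_n, W, b)

def second_order_step(x_n, x_nm1, W, b):
    """Second-order update: x_{n+1} = 2*x_n - x_{n-1} + f(x_n),
    a central-difference discretization of d^2x/dt^2 = f(x).
    Its state is the pair (x_n, x_{n-1}), i.e. 2*d numbers for d-many
    nodes per layer, matching the k*d embedding-dimension count for k = 2."""
    return 2 * x_n - x_nm1 + layer_fn(x_n, W, b)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 4                                    # nodes per layer (arbitrary)
    W = rng.standard_normal((d, d)) * 0.1    # shared weights, for simplicity
    b = np.zeros(d)

    x0 = rng.standard_normal(d)
    x_res = x0.copy()                        # first-order state: x_n
    x_prev, x_curr = x0.copy(), x0.copy()    # second-order state: (x_{n-1}, x_n)

    # Roll a few "layers" of each architecture forward from the same point.
    for _ in range(5):
        x_res = residual_step(x_res, W, b)
        x_prev, x_curr = x_curr, second_order_step(x_curr, x_prev, W, b)

    print("first-order state dimension :", x_res.size)              # d
    print("second-order state dimension:", x_prev.size + x_curr.size)  # 2*d
```

Running the sketch shows why the second-order architecture carries a state of $2 \cdot d$ numbers per layer while a plain residual network carries $d$: a $k^{th}$ order difference equation only becomes a first-order system when the last $k$ layer activations are stacked into one state vector, which is the $k \cdot d$ effective embedding dimension referred to in the abstract.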