ResearchTrend.AI
Time to Spike? Understanding the Representational Power of Spiking Neural Networks in Discrete Time

23 May 2025
Duc Anh Nguyen
Ernesto Araya
Adalbert Fono
Gitta Kutyniok
Abstract

Recent years have seen significant progress in developing spiking neural networks (SNNs) as a potential solution to the energy challenges posed by conventional artificial neural networks (ANNs). However, our theoretical understanding of SNNs remains relatively limited compared to the ever-growing body of literature on ANNs. In this paper, we study a discrete-time model of SNNs based on leaky integrate-and-fire (LIF) neurons, referred to as discrete-time LIF-SNNs, a widely used framework that still lacks solid theoretical foundations. We demonstrate that discrete-time LIF-SNNs with static inputs and outputs realize piecewise constant functions defined on polyhedral regions, and more importantly, we quantify the network size required to approximate continuous functions. Moreover, we investigate the impact of latency (number of time steps) and depth (number of layers) on the complexity of the input space partitioning induced by discrete-time LIF-SNNs. Our analysis highlights the importance of latency and contrasts these networks with ANNs employing piecewise linear activation functions. Finally, we present numerical experiments to support our theoretical findings.
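To make the model concrete, here is a minimal sketch of one discrete-time LIF layer of the kind the abstract refers to. The leak factor, threshold, and reset-by-subtraction rule are standard modeling choices, not details taken from the paper; the function name and parameters are illustrative assumptions.

```python
import numpy as np

def lif_layer(x_seq, W, leak=0.9, threshold=1.0):
    """Run one discrete-time LIF layer over T time steps.

    x_seq: (T, d_in) input sequence (a static input is repeated T times)
    W:     (d_out, d_in) weight matrix
    Returns the (T, d_out) binary spike trains.

    Note: leak/threshold values and the subtract-reset rule are common
    conventions, assumed here for illustration.
    """
    T, _ = x_seq.shape
    d_out = W.shape[0]
    v = np.zeros(d_out)              # membrane potentials
    spikes = np.zeros((T, d_out))
    for t in range(T):
        v = leak * v + W @ x_seq[t]          # leaky integration of input current
        s = (v >= threshold).astype(float)   # spike where threshold is crossed
        v -= threshold * s                   # soft reset: subtract threshold
        spikes[t] = s
    return spikes

# Static input repeated over 5 time steps
rng = np.random.default_rng(0)
W = rng.standard_normal((3, 2))
x = np.tile(np.array([0.5, -0.2]), (5, 1))
print(lif_layer(x, W).shape)  # (5, 3)
```

Because each neuron's output changes only when its membrane potential crosses the threshold, the map from a static input to the spike trains is constant on polyhedral regions of input space, which is the piecewise-constant behavior the paper analyzes.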

@article{nguyen2025_2505.18023,
  title={Time to Spike? Understanding the Representational Power of Spiking Neural Networks in Discrete Time},
  author={Duc Anh Nguyen and Ernesto Araya and Adalbert Fono and Gitta Kutyniok},
  journal={arXiv preprint arXiv:2505.18023},
  year={2025}
}