A Spike in Performance: Training Hybrid-Spiking Neural Networks with Quantized Activation Functions
arXiv:2002.03553
10 February 2020
Aaron R. Voelker, Daniel Rasmussen, C. Eliasmith

Papers citing "A Spike in Performance: Training Hybrid-Spiking Neural Networks with Quantized Activation Functions" (2 of 2 papers shown)
  1. The fine line between dead neurons and sparsity in binarized spiking neural networks
     Jason Eshraghian, Wei D. Lu (28 Jan 2022)
  2. Long short-term memory and learning-to-learn in networks of spiking neurons
     G. Bellec, Darjan Salaj, Anand Subramoney, Robert Legenstein, Wolfgang Maass (26 Mar 2018)