On Reducing Activity with Distillation and Regularization for Energy Efficient Spiking Neural Networks

26 June 2024
Thomas Louis, Benoit Miramond, Alain Pegatoquet, Adrien Girard
arXiv: 2406.18350

Papers citing "On Reducing Activity with Distillation and Regularization for Energy Efficient Spiking Neural Networks"

5 papers shown

Optimizing the Consumption of Spiking Neural Networks with Activity Regularization
Simon Narduzzi, Siavash Bigdeli, Shih-Chii Liu, L. A. Dunbar
9 citations, 04 Apr 2022

Surrogate Gradient Learning in Spiking Neural Networks
Emre Neftci, Hesham Mostafa, Friedemann Zenke
1,224 citations, 28 Jan 2019

Direct Training for Spiking Neural Networks: Faster, Larger, Better
Yujie Wu, Lei Deng, Guoqi Li, Jun Zhu, Luping Shi
646 citations, 16 Sep 2018

Deep Residual Learning for Small-Footprint Keyword Spotting
Raphael Tang, Jimmy J. Lin
237 citations, 28 Oct 2017

FitNets: Hints for Thin Deep Nets
Adriana Romero, Nicolas Ballas, Samira Ebrahimi Kahou, Antoine Chassang, C. Gatta, Yoshua Bengio
3,862 citations, 19 Dec 2014