arXiv:2005.00288 — Cited By
Distilling Spikes: Knowledge Distillation in Spiking Neural Networks
R. K. Kushawaha, S. Kumar, Biplab Banerjee, R. Velmurugan
1 May 2020
Papers citing "Distilling Spikes: Knowledge Distillation in Spiking Neural Networks" (8 of 8 shown):
Head-Tail-Aware KL Divergence in Knowledge Distillation for Spiking Neural Networks
Tianqing Zhang, Zixin Zhu, Kairong Yu, Hongwei Wang
29 Apr 2025
Temporal Separation with Entropy Regularization for Knowledge Distillation in Spiking Neural Networks
Kairong Yu, Chengting Yu, Tianqing Zhang, Xiaochen Zhao, Shu Yang, Hongwei Wang, Qiang Zhang, Qi Xu
5 Mar 2025
A Novel Spike Transformer Network for Depth Estimation from Event Cameras via Cross-modality Knowledge Distillation
Xin Zhang, Liangxiu Han, Tam Sobeih, Lianghao Han, Darren Dancey
26 Apr 2024
TT-SNN: Tensor Train Decomposition for Efficient Spiking Neural Network Training
Donghyun Lee, Ruokai Yin, Youngeun Kim, Abhishek Moitra, Yuhang Li, Priyadarshini Panda
15 Jan 2024
Biologically Inspired Structure Learning with Reverse Knowledge Distillation for Spiking Neural Networks
Qi Xu, Yaxin Li, Xuanye Fang, Jiangrong Shen, Jian K. Liu, Huajin Tang, Gang Pan
19 Apr 2023
LaSNN: Layer-wise ANN-to-SNN Distillation for Effective and Efficient Training in Deep Spiking Neural Networks
Di Hong, Jiangrong Shen, Yu Qi, Yueming Wang
17 Apr 2023
Constructing Deep Spiking Neural Networks from Artificial Neural Networks with Knowledge Distillation
Qi Xu, Yaxin Li, Jiangrong Shen, Jian K. Liu, Huajin Tang, Gang Pan
12 Apr 2023
Spatio-Temporal Pruning and Quantization for Low-latency Spiking Neural Networks
Sayeed Shafayet Chowdhury, Isha Garg, Kaushik Roy
26 Apr 2021