ResearchTrend.AI
Temporal Separation with Entropy Regularization for Knowledge Distillation in Spiking Neural Networks

5 March 2025
Authors: Kairong Yu, Chengting Yu, Tianqing Zhang, Xiaochen Zhao, Shu Yang, Hongwei Wang, Qiang Zhang, Qi Xu

Papers citing "Temporal Separation with Entropy Regularization for Knowledge Distillation in Spiking Neural Networks"

2 papers shown
TS-SNN: Temporal Shift Module for Spiking Neural Networks
Kairong Yu, Tianqing Zhang, Qi Xu, Gang Pan, Hongwei Wang
7 May 2025
Head-Tail-Aware KL Divergence in Knowledge Distillation for Spiking Neural Networks
Tianqing Zhang, Zixin Zhu, Kairong Yu, Hongwei Wang
29 April 2025