ResearchTrend.AI
arXiv:1909.04757
Boosting Throughput and Efficiency of Hardware Spiking Neural Accelerators using Time Compression Supporting Multiple Spike Codes

10 September 2019 · Changqin Xu, Wenrui Zhang, Yu Liu, Peng Li

Papers citing "Boosting Throughput and Efficiency of Hardware Spiking Neural Accelerators using Time Compression Supporting Multiple Spike Codes"

2 papers shown
Direct Training via Backpropagation for Ultra-low Latency Spiking Neural Networks with Multi-threshold
Changqin Xu, Yi Liu, Yintang Yang · 25 Nov 2021
VOWEL: A Local Online Learning Rule for Recurrent Networks of Probabilistic Spiking Winner-Take-All Circuits
Hyeryung Jang, N. Skatchkovsky, Osvaldo Simeone · 20 Apr 2020