ResearchTrend.AI

Energy-Efficient Inference Accelerator for Memory-Augmented Neural Networks on an FPGA

21 May 2018
Seongsik Park, Jaehee Jang, Seijoon Kim, Sungroh Yoon

Papers citing "Energy-Efficient Inference Accelerator for Memory-Augmented Neural Networks on an FPGA"

2 / 2 papers shown
HiMA: A Fast and Scalable History-based Memory Access Engine for Differentiable Neural Computer
Yaoyu Tao, Zhengya Zhang
15 Feb 2022
Fast and Efficient Information Transmission with Burst Spikes in Deep Spiking Neural Networks
Seongsik Park, Seijoon Kim, Hyeokjun Choe, Sungroh Yoon
10 Sep 2018