Towards Fast and Energy-Efficient Binarized Neural Network Inference on FPGA
arXiv:1810.02068 · 4 October 2018
Cheng Fu, Shilin Zhu, Hao Su, Ching-En Lee, Jishen Zhao
MQ

Papers citing "Towards Fast and Energy-Efficient Binarized Neural Network Inference on FPGA"

5 / 5 papers shown

RedBit: An End-to-End Flexible Framework for Evaluating the Accuracy of Quantized CNNs
A. M. Ribeiro-dos-Santos, João Dinis Ferreira, O. Mutlu, G. Falcão · MQ
15 Jan 2023

Guarding Machine Learning Hardware Against Physical Side-Channel Attacks
Anuj Dubey, Rosario Cammarota, Vikram B. Suresh, Aydin Aysu · AAML
01 Sep 2021

Binary Neural Networks: A Survey
Haotong Qin, Ruihao Gong, Xianglong Liu, Xiao Bai, Jingkuan Song, N. Sebe · MQ
31 Mar 2020

Binary Ensemble Neural Network: More Bits per Network or More Networks per Bit?
Shilin Zhu, Xin Dong, Hao Su · MQ
20 Jun 2018

Incremental Network Quantization: Towards Lossless CNNs with Low-Precision Weights
Aojun Zhou, Anbang Yao, Yiwen Guo, Lin Xu, Yurong Chen · MQ
10 Feb 2017