LUTNet: Rethinking Inference in FPGA Soft Logic

1 April 2019
Erwei Wang, James J. Davis, P. Cheung, George A. Constantinides

Papers citing "LUTNet: Rethinking Inference in FPGA Soft Logic"

6 papers shown
NeuraLUT: Hiding Neural Network Density in Boolean Synthesizable Functions
Marta Andronic, George A. Constantinides
29 Feb 2024

A Lightweight FPGA-based IDS-ECU Architecture for Automotive CAN
Shashwat Khandelwal, Shanker Shreejith
19 Jan 2024

A Scalable, Interpretable, Verifiable & Differentiable Logic Gate Convolutional Neural Network Architecture From Truth Tables
Adrien Benamira, Tristan Guérand, Thomas Peyrin, Trevor Yap, Bryan Hooi
18 Aug 2022

Enabling Binary Neural Network Training on the Edge
Erwei Wang, James J. Davis, Daniele Moro, Piotr Zielinski, Jia Jie Lim, C. Coelho, S. Chatterjee, P. Cheung, George A. Constantinides
08 Feb 2021

Reverse Derivative Ascent: A Categorical Approach to Learning Boolean Circuits
Paul W. Wilson, Fabio Zanasi
26 Jan 2021

LUTNet: Learning FPGA Configurations for Highly Efficient Neural Network Inference
Erwei Wang, James J. Davis, P. Cheung, George A. Constantinides
24 Oct 2019