eFAT: Improving the Effectiveness of Fault-Aware Training for Mitigating Permanent Faults in DNN Hardware Accelerators

20 April 2023
Muhammad Abdullah Hanif
Mohamed Bennai
arXiv: 2304.12949

Papers citing "eFAT: Improving the Effectiveness of Fault-Aware Training for Mitigating Permanent Faults in DNN Hardware Accelerators"

4 papers
1. Analyzing and Mitigating the Impact of Permanent Faults on a Systolic Array Based Neural Network Accelerator
   Jeff Zhang, Tianyu Gu, K. Basu, S. Garg
   11 Feb 2018

2. Bit Fusion: Bit-Level Dynamically Composable Architecture for Accelerating Deep Neural Networks
   Hardik Sharma, Jongse Park, Naveen Suda, Liangzhen Lai, Benson Chau, Joo-Young Kim, Vikas Chandra, H. Esmaeilzadeh
   05 Dec 2017

3. In-Datacenter Performance Analysis of a Tensor Processing Unit
   N. Jouppi, C. Young, Nishant Patil, David Patterson, Gaurav Agrawal, ..., Vijay Vasudevan, Richard Walter, Walter Wang, Eric Wilcox, Doe Hyun Yoon
   16 Apr 2017

4. Efficient Processing of Deep Neural Networks: A Tutorial and Survey
   Vivienne Sze, Yu-hsin Chen, Tien-Ju Yang, J. Emer
   27 Mar 2017