Zero-Space Cost Fault Tolerance for Transformer-based Language Models on ReRAM
arXiv:2401.11664 · 22 January 2024
Bingbing Li, Geng Yuan, Zigeng Wang, Shaoyi Huang, Hongwu Peng, Payman Behnam, Wujie Wen, Hang Liu, Caiwen Ding
Papers citing "Zero-Space Cost Fault Tolerance for Transformer-based Language Models on ReRAM" (5 papers)
Computing-In-Memory Neural Network Accelerators for Safety-Critical Systems: Can Small Device Variations Be Disastrous?
Zheyu Yan, X. S. Hu, Yiyu Shi — 15 Jul 2022
FORMS: Fine-grained Polarized ReRAM-based In-situ Computation for Mixed-signal DNN Accelerator
Geng Yuan, Payman Behnam, Zhengang Li, Ali Shafiee, Sheng Lin, ..., Hang Liu, Xuehai Qian, M. N. Bojnordi, Yanzhi Wang, Caiwen Ding — 16 Jun 2021
Analyzing Multi-Head Self-Attention: Specialized Heads Do the Heavy Lifting, the Rest Can Be Pruned
Elena Voita, David Talbot, F. Moiseev, Rico Sennrich, Ivan Titov — 23 May 2019
DARTS: Differentiable Architecture Search
Hanxiao Liu, Karen Simonyan, Yiming Yang — 24 Jun 2018
Deep Compression: Compressing Deep Neural Networks with Pruning, Trained Quantization and Huffman Coding
Song Han, Huizi Mao, W. Dally — 01 Oct 2015