Cited By: "Towards Fine-tuning Pre-trained Language Models with Integer Forward and Backward Propagation"
arXiv:2209.09815 (20 September 2022)
Authors: Mohammadreza Tayaranian, Alireza Ghaffari, Marzieh S. Tahaei, Mehdi Rezagholizadeh, M. Asgharian, V. Nia
Papers citing "Towards Fine-tuning Pre-trained Language Models with Integer Forward and Backward Propagation" (4 of 4 papers shown):

1. I-BERT: Integer-only BERT Quantization
   Sehoon Kim, A. Gholami, Z. Yao, Michael W. Mahoney, Kurt Keutzer
   05 Jan 2021

2. BinaryBERT: Pushing the Limit of BERT Quantization
   Haoli Bai, Wei Zhang, Lu Hou, Lifeng Shang, Jing Jin, Xin Jiang, Qun Liu, Michael Lyu, Irwin King
   31 Dec 2020

3. Q-BERT: Hessian Based Ultra Low Precision Quantization of BERT
   Sheng Shen, Zhen Dong, Jiayu Ye, Linjian Ma, Z. Yao, A. Gholami, Michael W. Mahoney, Kurt Keutzer
   12 Sep 2019

4. GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
   Alex Jinpeng Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
   20 Apr 2018