arXiv: 2201.08539
AutoDistill: an End-to-End Framework to Explore and Distill Hardware-Efficient Language Models
21 January 2022
Xiaofan Zhang, Zongwei Zhou, Deming Chen, Yu Emma Wang
Papers citing "AutoDistill: an End-to-End Framework to Explore and Distill Hardware-Efficient Language Models" (4 of 4 papers shown)

1. Domain Adaptable Fine-Tune Distillation Framework For Advancing Farm Surveillance
   Raza Imam, Muhammad Huzaifa, Nabil Mansour, Shaher Bano Mirza, Fouad Lamghari
   10 Feb 2024

2. Design Automation for Fast, Lightweight, and Effective Deep Learning Models: A Survey
   Dalin Zhang, Kaixuan Chen, Yan Zhao, B. Yang, Li-Ping Yao, Christian S. Jensen
   22 Aug 2022

3. Q-BERT: Hessian Based Ultra Low Precision Quantization of BERT
   Sheng Shen, Zhen Dong, Jiayu Ye, Linjian Ma, Z. Yao, A. Gholami, Michael W. Mahoney, Kurt Keutzer
   12 Sep 2019

4. GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
   Alex Jinpeng Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
   20 Apr 2018