Model compression via distillation and quantization
A. Polino, Razvan Pascanu, Dan Alistarh (MQ)
arXiv: 1802.05668 · 15 February 2018
Papers citing "Model compression via distillation and quantization" (50 of 171 papers shown)
Auto Graph Encoder-Decoder for Neural Network Pruning
Sixing Yu, Arya Mazaheri, Ali Jannesari (GNN) · 27 / 38 / 0 · 25 Nov 2020

Neural Network Compression Via Sparse Optimization
Tianyi Chen, Bo Ji, Yixin Shi, Tianyu Ding, Biyi Fang, Sheng Yi, Xiao Tu · 36 / 15 / 0 · 10 Nov 2020

Stochastic Precision Ensemble: Self-Knowledge Distillation for Quantized Deep Neural Networks
Yoonho Boo, Sungho Shin, Jungwook Choi, Wonyong Sung (MQ) · 30 / 29 / 0 · 30 Sep 2020

TernaryBERT: Distillation-aware Ultra-low Bit BERT
Wei Zhang, Lu Hou, Yichun Yin, Lifeng Shang, Xiao Chen, Xin Jiang, Qun Liu (MQ) · 35 / 209 / 0 · 27 Sep 2020

Efficient Transformer-based Large Scale Language Representations using Hardware-friendly Block Structured Pruning
Bingbing Li, Zhenglun Kong, Tianyun Zhang, Ji Li, Zechao Li, Hang Liu, Caiwen Ding (VLM) · 32 / 64 / 0 · 17 Sep 2020

Classification of Diabetic Retinopathy Using Unlabeled Data and Knowledge Distillation
Sajjad Abbasi, M. Hajabdollahi, P. Khadivi, N. Karimi, Roshank Roshandel, S. Shirani, S. Samavi · 14 / 18 / 0 · 01 Sep 2020

Compression of Deep Learning Models for Text: A Survey
Manish Gupta, Puneet Agrawal (VLM, MedIm, AI4CE) · 22 / 115 / 0 · 12 Aug 2020

NASB: Neural Architecture Search for Binary Convolutional Neural Networks
Baozhou Zhu, Zaid Al-Ars, P. Hofstee (MQ) · 24 / 23 / 0 · 08 Aug 2020

Split Computing for Complex Object Detectors: Challenges and Preliminary Results
Yoshitomo Matsubara, Marco Levorato · 51 / 25 / 0 · 27 Jul 2020

HMQ: Hardware Friendly Mixed Precision Quantization Block for CNNs
H. Habi, Roy H. Jennings, Arnon Netzer (MQ) · 29 / 65 / 0 · 20 Jul 2020

Distillation Guided Residual Learning for Binary Convolutional Neural Networks
Jianming Ye, Shiliang Zhang, Jingdong Wang (MQ) · 30 / 19 / 0 · 10 Jul 2020

Tracking-by-Trackers with a Distilled and Reinforced Model
Matteo Dunnhofer, N. Martinel, C. Micheloni (VOT, OffRL) · 27 / 4 / 0 · 08 Jul 2020

Operation-Aware Soft Channel Pruning using Differentiable Masks
Minsoo Kang, Bohyung Han (AAML) · 40 / 139 / 0 · 08 Jul 2020

A Sequential Self Teaching Approach for Improving Generalization in Sound Event Recognition
Anurag Kumar, V. Ithapu · 25 / 35 / 0 · 30 Jun 2020

Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao (VLM) · 31 / 2,857 / 0 · 09 Jun 2020
An Overview of Neural Network Compression
James O'Neill (AI4CE) · 52 / 98 / 0 · 05 Jun 2020
TIMELY: Pushing Data Movements and Interfaces in PIM Accelerators Towards Local and in Time Domain
Weitao Li, Pengfei Xu, Yang Zhao, Haitong Li, Yuan Xie, Yingyan Lin · 17 / 69 / 0 · 03 May 2020

Binary Neural Networks: A Survey
Haotong Qin, Ruihao Gong, Xianglong Liu, Xiao Bai, Jingkuan Song, N. Sebe (MQ) · 55 / 459 / 0 · 31 Mar 2020

Introducing Pose Consistency and Warp-Alignment for Self-Supervised 6D Object Pose Estimation in Color Images
Juil Sock, Guillermo Garcia-Hernando, Anil Armagan, Tae-Kyun Kim · 29 / 5 / 0 · 27 Mar 2020

GAN Compression: Efficient Architectures for Interactive Conditional GANs
Zhekai Zhang, Ji Lin, Yaoyao Ding, Zhijian Liu, Jun-Yan Zhu, Song Han (GAN) · 24 / 2 / 0 · 19 Mar 2020

Ternary Compression for Communication-Efficient Federated Learning
Jinjin Xu, W. Du, Ran Cheng, Wangli He, Yaochu Jin (MQ, FedML) · 47 / 174 / 0 · 07 Mar 2020

MeliusNet: Can Binary Neural Networks Achieve MobileNet-level Accuracy?
Joseph Bethge, Christian Bartz, Haojin Yang, Ying-Cong Chen, Christoph Meinel (MQ) · 25 / 91 / 0 · 16 Jan 2020

Noisy Machines: Understanding Noisy Neural Networks and Enhancing Robustness to Analog Hardware Errors Using Distillation
Chuteng Zhou, Prad Kadambi, Matthew Mattina, P. Whatmough · 23 / 35 / 0 · 14 Jan 2020

Resource-Efficient Neural Networks for Embedded Systems
Wolfgang Roth, Günther Schindler, Lukas Pfeifenberger, Robert Peharz, Sebastian Tschiatschek, Holger Fröning, Franz Pernkopf, Zoubin Ghahramani · 34 / 47 / 0 · 07 Jan 2020

Modeling Teacher-Student Techniques in Deep Neural Networks for Knowledge Distillation
Sajjad Abbasi, M. Hajabdollahi, N. Karimi, S. Samavi · 17 / 28 / 0 · 31 Dec 2019

FQ-Conv: Fully Quantized Convolution for Efficient and Accurate Inference
Bram-Ernst Verhoef, Nathan Laubeuf, S. Cosemans, P. Debacker, Ioannis A. Papistas, A. Mallik, D. Verkest (MQ) · 19 / 16 / 0 · 19 Dec 2019

Online Knowledge Distillation with Diverse Peers
Defang Chen, Jian-Ping Mei, Can Wang, Yan Feng, Chun-Yen Chen (FedML) · 36 / 297 / 0 · 01 Dec 2019

QKD: Quantization-aware Knowledge Distillation
Jangho Kim, Yash Bhalgat, Jinwon Lee, Chirag I. Patel, Nojun Kwak (MQ) · 29 / 64 / 0 · 28 Nov 2019

Iteratively Training Look-Up Tables for Network Quantization
Fabien Cardinaux, Stefan Uhlich, K. Yoshiyama, Javier Alonso García, Lukas Mauch, Stephen Tiedemann, Thomas Kemp, Akira Nakamura (MQ) · 27 / 16 / 0 · 12 Nov 2019

A Simplified Fully Quantized Transformer for End-to-end Speech Recognition
Alex Bie, Bharat Venkitesh, João Monteiro, Md. Akmal Haidar, Mehdi Rezagholizadeh (MQ) · 32 / 27 / 0 · 09 Nov 2019

Explainable Artificial Intelligence (XAI): Concepts, Taxonomies, Opportunities and Challenges toward Responsible AI
Alejandro Barredo Arrieta, Natalia Díaz Rodríguez, Javier Del Ser, Adrien Bennetot, Siham Tabik, ..., S. Gil-Lopez, Daniel Molina, Richard Benjamins, Raja Chatila, Francisco Herrera (XAI) · 44 / 6,125 / 0 · 22 Oct 2019

Model Compression with Two-stage Multi-teacher Knowledge Distillation for Web Question Answering System
Ze Yang, Linjun Shou, Ming Gong, Wutao Lin, Daxin Jiang · 28 / 92 / 0 · 18 Oct 2019

Fully Quantized Transformer for Machine Translation
Gabriele Prato, Ella Charlaix, Mehdi Rezagholizadeh (MQ) · 13 / 68 / 0 · 17 Oct 2019

Impact of Low-bitwidth Quantization on the Adversarial Robustness for Embedded Neural Networks
Rémi Bernhard, Pierre-Alain Moëllic, J. Dutertre (AAML, MQ) · 29 / 18 / 0 · 27 Sep 2019

Structured Binary Neural Networks for Image Recognition
Bohan Zhuang, Chunhua Shen, Mingkui Tan, Peng Chen, Lingqiao Liu, Ian Reid (MQ) · 27 / 17 / 0 · 22 Sep 2019

Knowledge Distillation for End-to-End Person Search
Bharti Munjal, Fabio Galasso, S. Amin (FedML) · 48 / 15 / 0 · 03 Sep 2019

Patient Knowledge Distillation for BERT Model Compression
S. Sun, Yu Cheng, Zhe Gan, Jingjing Liu · 78 / 832 / 0 · 25 Aug 2019

Effective Training of Convolutional Neural Networks with Low-bitwidth Weights and Activations
Bohan Zhuang, Jing Liu, Mingkui Tan, Lingqiao Liu, Ian Reid, Chunhua Shen (MQ) · 31 / 45 / 0 · 10 Aug 2019

Distilling Knowledge From a Deep Pose Regressor Network
Muhamad Risqi U. Saputra, Pedro Porto Buarque de Gusmão, Yasin Almalioglu, Andrew Markham, A. Trigoni · 25 / 102 / 0 · 02 Aug 2019

Lifelong GAN: Continual Learning for Conditional Image Generation
Mengyao Zhai, Lei Chen, Frederick Tung, Jiawei He, Megha Nawhal, Greg Mori (CLL) · 36 / 180 / 0 · 23 Jul 2019

Similarity-Preserving Knowledge Distillation
Frederick Tung, Greg Mori · 48 / 961 / 0 · 23 Jul 2019
Distill-2MD-MTL: Data Distillation based on Multi-Dataset Multi-Domain Multi-Task Frame Work to Solve Face Related Tasks, Multi Task Learning, Semi-Supervised Learning
Sepidehsadat Hosseini, M. Shabani, N. Cho (CVBM) · 36 / 3 / 0 · 08 Jul 2019
SLSNet: Skin lesion segmentation using a lightweight generative adversarial network
Md. Mostafa Kamal Sarker, Hatem A. Rashwan, Farhan Akram, V. Singh, Syeda Furruka Banu, ..., Kabir Ahmed Choudhury, Sylvie Chambon, Petia Radeva, D. Puig, M. Abdel-Nasser (GAN, MedIm) · 39 / 27 / 0 · 01 Jul 2019

Divide and Conquer: Leveraging Intermediate Feature Representations for Quantized Training of Neural Networks
Ahmed T. Elthakeb, Prannoy Pilligundla, Alex Cloninger, H. Esmaeilzadeh (MQ) · 26 / 8 / 0 · 14 Jun 2019

Fighting Quantization Bias With Bias
Alexander Finkelstein, Uri Almog, Mark Grobman (MQ) · 28 / 56 / 0 · 07 Jun 2019

Latent Weights Do Not Exist: Rethinking Binarized Neural Network Optimization
K. Helwegen, James Widdicombe, Lukas Geiger, Zechun Liu, K. Cheng, Roeland Nusselder (MQ) · 27 / 110 / 0 · 05 Jun 2019

Multi-Precision Quantized Neural Networks via Encoding Decomposition of -1 and +1
Qigong Sun, Fanhua Shang, Kan Yang, Xiufang Li, Yan Ren, L. Jiao (MQ) · 46 / 12 / 0 · 31 May 2019

MobiVSR: A Visual Speech Recognition Solution for Mobile Devices
Nilay Shrivastava, Astitwa Saxena, Yaman Kumar Singla, Preeti Kaur, Debanjan Mahata, R. Shah · 27 / 3 / 0 · 10 May 2019

High Frequency Residual Learning for Multi-Scale Image Classification
Bowen Cheng, Rong Xiao, Jianfeng Wang, Thomas Huang, Lei Zhang · 34 / 21 / 0 · 07 May 2019

Relational Knowledge Distillation
Wonpyo Park, Dongju Kim, Yan Lu, Minsu Cho · 16 / 1,387 / 0 · 10 Apr 2019