Paying More Attention to Attention: Improving the Performance of Convolutional Neural Networks via Attention Transfer
Sergey Zagoruyko, N. Komodakis
arXiv:1612.03928, 12 December 2016
Papers citing "Paying More Attention to Attention: Improving the Performance of Convolutional Neural Networks via Attention Transfer" (50 of 1,157 papers shown)
1. Sequence-to-Sequence Learning via Attention Transfer for Incremental Speech Recognition
   Sashi Novitasari, Andros Tjandra, S. Sakti, Satoshi Nakamura (CLL), 04 Nov 2020
2. Unsupervised Attention Based Instance Discriminative Learning for Person Re-Identification
   Kshitij Nikhal, B. Riggan, 03 Nov 2020
3. Distilling Knowledge by Mimicking Features
   G. Wang, Yifan Ge, Jianxin Wu, 03 Nov 2020
4. Parameter Efficient Deep Neural Networks with Bilinear Projections
   Litao Yu, Yongsheng Gao, Jun Zhou, Jian Zhang, 03 Nov 2020
5. ProxylessKD: Direct Knowledge Distillation with Inherited Classifier for Face Recognition
   W. Shi, Guanghui Ren, Yunpeng Chen, Shuicheng Yan (CVBM), 31 Oct 2020
6. Class-incremental learning: survey and performance evaluation on image classification
   Marc Masana, Xialei Liu, Bartlomiej Twardowski, Mikel Menta, Andrew D. Bagdanov, Joost van de Weijer (CLL), 28 Oct 2020
7. Attribution Preservation in Network Compression for Reliable Network Interpretation
   Geondo Park, J. Yang, Sung Ju Hwang, Eunho Yang, 28 Oct 2020
8. CompRess: Self-Supervised Learning by Compressing Representations
   Soroush Abbasi Koohpayegani, Ajinkya Tejankar, Hamed Pirsiavash (SSL), 28 Oct 2020
9. Discriminative feature generation for classification of imbalanced data
   Sungho Suh, P. Lukowicz, Y. Lee, 24 Oct 2020
10. Comprehensive Attention Self-Distillation for Weakly-Supervised Object Detection
    Zeyi Huang, Yang Zou, V. Bhagavatula, Dong Huang (WSOD), 22 Oct 2020
11. A Survey on Deep Learning and Explainability for Automatic Report Generation from Medical Images
    Pablo Messina, Pablo Pino, Denis Parra, Alvaro Soto, Cecilia Besa, S. Uribe, Marcelo Andía, C. Tejos, Claudia Prieto, Daniel Capurro (MedIm), 20 Oct 2020
12. Towards Accurate Knowledge Transfer via Target-awareness Representation Disentanglement
    Xingjian Li, Di Hu, Xuhong Li, Haoyi Xiong, Zhiquan Ye, Zhipeng Wang, Chengzhong Xu, Dejing Dou (AAML), 16 Oct 2020
13. Reducing the Teacher-Student Gap via Spherical Knowledge Distillation
    Jia Guo, Minghao Chen, Yao Hu, Chen Zhu, Xiaofei He, Deng Cai, 15 Oct 2020
14. Towards Accurate Quantization and Pruning via Data-free Knowledge Transfer
    Chen Zhu, Zheng Xu, Ali Shafahi, Manli Shu, Amin Ghiasi, Tom Goldstein (MQ), 14 Oct 2020
15. Measuring Visual Generalization in Continuous Control from Pixels
    J. E. Grigsby, Yanjun Qi, 13 Oct 2020
16. Top-DB-Net: Top DropBlock for Activation Enhancement in Person Re-Identification
    Rodolfo Quispe, Hélio Pedrini, 12 Oct 2020
17. Locally Linear Region Knowledge Distillation
    Xiang Deng, Zhongfei Zhang, 09 Oct 2020
18. Be Your Own Best Competitor! Multi-Branched Adversarial Knowledge Transfer
    Mahdi Ghorbani, Fahimeh Fooladgar, S. Kasaei (AAML), 09 Oct 2020
19. Tatum-Level Drum Transcription Based on a Convolutional Recurrent Neural Network with Language Model-Based Regularized Training
    Ryoto Ishizuka, Ryo Nishikimi, Eita Nakamura, Kazuyoshi Yoshii, 08 Oct 2020
20. Towards Cross-modality Medical Image Segmentation with Online Mutual Knowledge Distillation
    Kang Li, Lequan Yu, Shujun Wang, Pheng-Ann Heng, 04 Oct 2020
21. UCP: Uniform Channel Pruning for Deep Convolutional Neural Networks Compression and Acceleration
    Jingfei Chang, Yang Lu, Ping Xue, Xing Wei, Zhen Wei, 03 Oct 2020
22. Neighbourhood Distillation: On the benefits of non end-to-end distillation
    Laetitia Shao, Max Moroz, Elad Eban, Yair Movshovitz-Attias (ODL), 02 Oct 2020
23. Online Knowledge Distillation via Multi-branch Diversity Enhancement
    Zheng Li, Ying Huang, Defang Chen, Tianren Luo, Ning Cai, Zhigeng Pan, 02 Oct 2020
24. Improved Knowledge Distillation via Full Kernel Matrix Transfer
    Qi Qian, Hao Li, Juhua Hu, 30 Sep 2020
25. TinyGAN: Distilling BigGAN for Conditional Image Generation
    Ting-Yun Chang, Chi-Jen Lu (GAN), 29 Sep 2020
26. Kernel Based Progressive Distillation for Adder Neural Networks
    Yixing Xu, Chang Xu, Xinghao Chen, Wei Zhang, Chunjing Xu, Yunhe Wang, 28 Sep 2020
27. A Computer Vision Approach to Combat Lyme Disease
    Sina Akbarian, Tania Cawston, Laurent Moreno, Samir B. Patel, Vanessa Allen, Elham Dolatabadi, 24 Sep 2020
28. Unsupervised Transfer Learning for Spatiotemporal Predictive Networks
    Zhiyu Yao, Yunbo Wang, Mingsheng Long, Jianmin Wang (AI4TS), 24 Sep 2020
29. MimicDet: Bridging the Gap Between One-Stage and Two-Stage Object Detection
    Xin Lu, Quanquan Li, Buyu Li, Junjie Yan (ObjD), 24 Sep 2020
30. Dual-path CNN with Max Gated block for Text-Based Person Re-identification
    Tinghuai Ma, Mingming Yang, Huan Rong, Yurong Qian, Y. Tian, N. Al-Nabhan, 20 Sep 2020
31. Introspective Learning by Distilling Knowledge from Online Self-explanation
    Jindong Gu, Zhiliang Wu, Volker Tresp, 19 Sep 2020
32. Holistic Grid Fusion Based Stop Line Estimation
    Runsheng Xu, Faezeh Tafazzoli, Li Zhang, Timo Rehfeld, Gunther Krehl, Arunava Seal, 18 Sep 2020
33. Densely Guided Knowledge Distillation using Multiple Teacher Assistants
    Wonchul Son, Jaemin Na, Junyong Choi, Wonjun Hwang, 18 Sep 2020
34. S2SD: Simultaneous Similarity-based Self-Distillation for Deep Metric Learning
    Karsten Roth, Timo Milbich, Bjorn Ommer, Joseph Paul Cohen, Marzyeh Ghassemi (FedML), 17 Sep 2020
35. Collaborative Group Learning
    Shaoxiong Feng, Hongshen Chen, Xuancheng Ren, Zhuoye Ding, Kan Li, Xu Sun, 16 Sep 2020
36. Mimic and Conquer: Heterogeneous Tree Structure Distillation for Syntactic NLP
    Hao Fei, Yafeng Ren, Donghong Ji, 16 Sep 2020
37. Noisy Self-Knowledge Distillation for Text Summarization
    Yang Liu, S. Shen, Mirella Lapata, 15 Sep 2020
38. Collaborative Distillation in the Parameter and Spectrum Domains for Video Action Recognition
    Haisheng Su, Jing Su, Dongliang Wang, Weihao Gan, Wei Wu, Mengmeng Wang, Junjie Yan, Yu Qiao, 15 Sep 2020
39. Decoupling Representation Learning from Reinforcement Learning
    Adam Stooke, Kimin Lee, Pieter Abbeel, Michael Laskin (SSL, DRL), 14 Sep 2020
40. Diversified Mutual Learning for Deep Metric Learning
    Wonpyo Park, Wonjae Kim, Kihyun You, Minsu Cho (FedML), 09 Sep 2020
41. On the Orthogonality of Knowledge Distillation with Other Techniques: From an Ensemble Perspective
    Seonguk Park, Kiyoon Yoo, Nojun Kwak (FedML), 09 Sep 2020
42. Intra-Utterance Similarity Preserving Knowledge Distillation for Audio Tagging
    Chun-Chieh Chang, Chieh-Chi Kao, Ming Sun, Chao Wang, 03 Sep 2020
43. Semantics-aware Adaptive Knowledge Distillation for Sensor-to-Vision Action Recognition
    Yang Liu, Keze Wang, Guanbin Li, Liang Lin, 01 Sep 2020
44. Evaluating Knowledge Transfer in Neural Network for Medical Images
    Sina Akbarian, Laleh Seyyed-Kalantari, Farzad Khalvati, Elham Dolatabadi, 31 Aug 2020
45. Self-supervised Video Representation Learning by Uncovering Spatio-temporal Statistics
    Jiangliu Wang, Jianbo Jiao, Linchao Bao, Shengfeng He, Wei Liu, Yunhui Liu (SSL, AI4TS), 31 Aug 2020
46. MetaDistiller: Network Self-Boosting via Meta-Learned Top-Down Distillation
    Benlin Liu, Yongming Rao, Jiwen Lu, Jie Zhou, Cho-Jui Hsieh, 27 Aug 2020
47. New Directions in Distributed Deep Learning: Bringing the Network at Forefront of IoT Design
    Kartikeya Bhardwaj, Wei Chen, R. Marculescu (GNN), 25 Aug 2020
48. Matching Guided Distillation
    Kaiyu Yue, Jiangfan Deng, Feng Zhou, 23 Aug 2020
49. Self-Supervised Ultrasound to MRI Fetal Brain Image Synthesis
    Jianbo Jiao, A. Namburete, A. Papageorghiou, J. A. Noble (MedIm), 19 Aug 2020
50. Restructuring, Pruning, and Adjustment of Deep Models for Parallel Distributed Inference
    Afshin Abdi, Saeed Rashidi, Faramarz Fekri, T. Krishna, 19 Aug 2020