Variational Information Distillation for Knowledge Transfer
11 April 2019
Sungsoo Ahn, S. Hu, Andreas C. Damianou, Neil D. Lawrence, Zhenwen Dai
arXiv: 1904.05835
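For context, the paper above casts feature distillation as maximizing a variational lower bound on the mutual information between teacher and student activations. Below is a minimal PyTorch-style sketch of that kind of objective, assuming a 1x1-conv mean predictor and a per-channel softplus variance; these parameterization details and the channel counts in the comments are illustrative assumptions, not details taken from this page.

```python
# Sketch of a variational feature-distillation loss: train the student to
# predict teacher features through a Gaussian q(t|s), which lower-bounds
# the mutual information I(t; s) up to a constant entropy term.
import torch
import torch.nn as nn
import torch.nn.functional as F


class VIDLoss(nn.Module):
    """-E[log q(t|s)] with q(t|s) = N(mu(s), diag(sigma^2)), one sigma per channel."""

    def __init__(self, s_channels: int, t_channels: int):
        super().__init__()
        # 1x1 convs map student features to a predicted teacher mean
        # (assumes the two feature maps share spatial size).
        self.mu = nn.Sequential(
            nn.Conv2d(s_channels, t_channels, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(t_channels, t_channels, kernel_size=1),
        )
        # One learnable (heteroscedastic) variance per teacher channel.
        self.log_scale = nn.Parameter(torch.zeros(t_channels))

    def forward(self, s_feat: torch.Tensor, t_feat: torch.Tensor) -> torch.Tensor:
        mu = self.mu(s_feat)                          # (N, C_t, H, W)
        var = F.softplus(self.log_scale) + 1e-6       # (C_t,), kept positive
        var = var.view(1, -1, 1, 1)
        # Negative Gaussian log-likelihood, up to additive constants.
        nll = 0.5 * ((t_feat - mu) ** 2 / var + torch.log(var))
        return nll.mean()
```

In a training loop, `VIDLoss(64, 256)(student_fmap, teacher_fmap.detach())` would be added to the usual task loss; minimizing it tightens the bound on the teacher-student mutual information that the title refers to.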
Papers citing "Variational Information Distillation for Knowledge Transfer" (50 of 321 shown)
| Title | Authors | Tags | Counts | Date |
|---|---|---|---|---|
| Improve Knowledge Distillation via Label Revision and Data Selection | Weichao Lan, Yiu-ming Cheung, Qing Xu, Buhua Liu, Zhikai Hu, Mengke Li, Zhenghua Chen | | 37 / 2 / 0 | 03 Apr 2024 |
| Diffusion Deepfake | Chaitali Bhattacharyya, Hanxiao Wang, Feng Zhang, Sung-Ha Kim, Xiatian Zhu | | 32 / 5 / 0 | 02 Apr 2024 |
| LNPT: Label-free Network Pruning and Training | Jinying Xiao, Ping Li, Zhe Tang, Jie Nie | | 38 / 2 / 0 | 19 Mar 2024 |
| Self-Supervised Quantization-Aware Knowledge Distillation | Kaiqi Zhao, Ming Zhao | MQ | 38 / 2 / 0 | 17 Mar 2024 |
| Don't Judge by the Look: Towards Motion Coherent Video Representation | Yitian Zhang, Yue Bai, Huan Wang, Yizhou Wang, Yun Fu | | 35 / 0 / 0 | 14 Mar 2024 |
| PYRA: Parallel Yielding Re-Activation for Training-Inference Efficient Task Adaptation | Yizhe Xiong, Hui Chen, Tianxiang Hao, Zijia Lin, Jungong Han, Yuesong Zhang, Guoxin Wang, Yongjun Bao, Guiguang Ding | | 51 / 17 / 0 | 14 Mar 2024 |
| LIX: Implicitly Infusing Spatial Geometric Prior Knowledge into Visual Semantic Segmentation for Autonomous Driving | Sicen Guo, Zhiyuan Wu, Qijun Chen, Ioannis Pitas, Rui Fan | | 40 / 1 / 0 | 13 Mar 2024 |
| Distilling the Knowledge in Data Pruning | Emanuel Ben-Baruch, Adam Botach, Igor Kviatkovsky, Manoj Aggarwal, Gérard Medioni | | 38 / 1 / 0 | 12 Mar 2024 |
| Frequency Attention for Knowledge Distillation | Cuong Pham, Van-Anh Nguyen, Trung Le, Dinh Q. Phung, Gustavo Carneiro, Thanh-Toan Do | | 32 / 16 / 0 | 09 Mar 2024 |
| Logit Standardization in Knowledge Distillation | Shangquan Sun, Wenqi Ren, Jingzhi Li, Rui Wang, Xiaochun Cao | | 37 / 59 / 0 | 03 Mar 2024 |
| GraphKD: Exploring Knowledge Distillation Towards Document Object Detection with Structured Graph Creation | Ayan Banerjee, Sanket Biswas, Josep Lladós, Umapada Pal | | 46 / 1 / 0 | 17 Feb 2024 |
| Knowledge Distillation Based on Transformed Teacher Matching | Kaixiang Zheng, En-Hui Yang | | 32 / 19 / 0 | 17 Feb 2024 |
| FedD2S: Personalized Data-Free Federated Knowledge Distillation | Kawa Atapour, S. J. Seyedmohammadi, J. Abouei, Arash Mohammadi, Konstantinos N. Plataniotis | FedML | 35 / 2 / 0 | 16 Feb 2024 |
| Closed-Loop Unsupervised Representation Disentanglement with β-VAE Distillation and Diffusion Probabilistic Feedback | Xin Jin, Bo Li, Baao Xie, Wenyao Zhang, Jinming Liu, Ziqiang Li, Tao Yang, Wenjun Zeng | DRL, DiffM, CoGe | 39 / 7 / 0 | 04 Feb 2024 |
| Precise Knowledge Transfer via Flow Matching | Shitong Shao, Zhiqiang Shen, Linrui Gong, Huanran Chen, Xu Dai | | 34 / 2 / 0 | 03 Feb 2024 |
| Rethinking Centered Kernel Alignment in Knowledge Distillation | Zikai Zhou, Yunhang Shen, Shitong Shao, Linrui Gong, Shaohui Lin | | 24 / 1 / 0 | 22 Jan 2024 |
| Bayes Conditional Distribution Estimation for Knowledge Distillation Based on Conditional Mutual Information | Linfeng Ye, Shayan Mohajer Hamidi, Renhao Tan, En-Hui Yang | VLM | 37 / 14 / 0 | 16 Jan 2024 |
| Convolutional Neural Network Compression via Dynamic Parameter Rank Pruning | Manish Sharma, Jamison Heard, Eli Saber, Panos P. Markopoulos | | 31 / 1 / 0 | 15 Jan 2024 |
| Direct Distillation between Different Domains | Jialiang Tang, Shuo Chen, Gang Niu, Hongyuan Zhu, Qiufeng Wang, Chen Gong, Masashi Sugiyama | | 60 / 3 / 0 | 12 Jan 2024 |
| One-Shot Multi-Rate Pruning of Graph Convolutional Networks | H. Sahbi | | 33 / 0 / 0 | 29 Dec 2023 |
| Revisiting Knowledge Distillation under Distribution Shift | Songming Zhang, Ziyu Lyu, Xiaofeng Chen | | 32 / 1 / 0 | 25 Dec 2023 |
| AM-RADIO: Agglomerative Vision Foundation Model -- Reduce All Domains Into One | Michael Ranzinger, Greg Heinrich, Jan Kautz, Pavlo Molchanov | VLM | 44 / 42 / 0 | 10 Dec 2023 |
| Regressor-Segmenter Mutual Prompt Learning for Crowd Counting | Mingyue Guo, Li Yuan, Zhaoyi Yan, Binghui Chen, Yaowei Wang, QiXiang Ye | | 38 / 4 / 0 | 04 Dec 2023 |
| Maximizing Discrimination Capability of Knowledge Distillation with Energy Function | Seonghak Kim, Gyeongdo Ham, Suin Lee, Donggon Jang, Daeshik Kim | | 34 / 4 / 0 | 24 Nov 2023 |
| Robustness-Reinforced Knowledge Distillation with Correlation Distance and Network Pruning | Seonghak Kim, Gyeongdo Ham, Yucheol Cho, Daeshik Kim | | 30 / 3 / 0 | 23 Nov 2023 |
| Comparative Knowledge Distillation | Alex Wilf, Alex Tianyi Xu, Paul Pu Liang, A. Obolenskiy, Daniel Fried, Louis-Philippe Morency | VLM | 23 / 1 / 0 | 03 Nov 2023 |
| Improving Knowledge Distillation with Teacher's Explanation | S. Chowdhury, Ben Liang, A. Tizghadam, Ilijc Albanese | FAtt | 19 / 0 / 0 | 04 Oct 2023 |
| Deep Model Fusion: A Survey | Weishi Li, Yong Peng, Miao Zhang, Liang Ding, Han Hu, Li Shen | FedML, MoMe | 43 / 52 / 0 | 27 Sep 2023 |
| Inherit with Distillation and Evolve with Contrast: Exploring Class Incremental Semantic Segmentation Without Exemplar Memory | Danpei Zhao, Bo Yuan, Z. Shi | VLM, CLL | 33 / 9 / 0 | 27 Sep 2023 |
| Knowledge Distillation Layer that Lets the Student Decide | Ada Gorgun, Y. Z. Gürbüz, A. Aydin Alatan | | 29 / 0 / 0 | 06 Sep 2023 |
| Bridging Cross-task Protocol Inconsistency for Distillation in Dense Object Detection | Longrong Yang, Xianpan Zhou, Xuewei Li, Liang Qiao, Zheyang Li, Zi-Liang Yang, Gaoang Wang, Xi Li | | 21 / 17 / 0 | 28 Aug 2023 |
| QD-BEV: Quantization-aware View-guided Distillation for Multi-view 3D Object Detection | Yifan Zhang, Zhen Dong, Huanrui Yang, Ming Lu, Cheng-Ching Tseng, Yuan Du, Kurt Keutzer, Li Du, Shanghang Zhang | MQ | 34 / 9 / 0 | 21 Aug 2023 |
| Learning Lightweight Object Detectors via Multi-Teacher Progressive Distillation | Shengcao Cao, Mengtian Li, James Hays, Deva Ramanan, Yu-xiong Wang, Liangyan Gui | VLM | 26 / 11 / 0 | 17 Aug 2023 |
| SRMAE: Masked Image Modeling for Scale-Invariant Deep Representations | Zhiming Wang, Lin Gu, Feng Lu | | 32 / 0 / 0 | 17 Aug 2023 |
| Teacher-Student Architecture for Knowledge Distillation: A Survey | Chengming Hu, Xuan Li, Danyang Liu, Haolun Wu, Xi Chen, Ju Wang, Xue Liu | | 21 / 16 / 0 | 08 Aug 2023 |
| Class-relation Knowledge Distillation for Novel Class Discovery | Peiyan Gu, Chuyu Zhang, Rui Xu, Xuming He | | 34 / 16 / 0 | 18 Jul 2023 |
| Frameless Graph Knowledge Distillation | Dai Shi, Zhiqi Shao, Yi Guo, Junbin Gao | | 39 / 4 / 0 | 13 Jul 2023 |
| Review helps learn better: Temporal Supervised Knowledge Distillation | Dongwei Wang, Zhi Han, Yanmei Wang, Xi’ai Chen, Baichen Liu, Yandong Tang | | 60 / 1 / 0 | 03 Jul 2023 |
| Miniaturized Graph Convolutional Networks with Topologically Consistent Pruning | H. Sahbi | | 28 / 0 / 0 | 30 Jun 2023 |
| Streaming egocentric action anticipation: An evaluation scheme and approach | Antonino Furnari, G. Farinella | EgoV | 21 / 3 / 0 | 29 Jun 2023 |
| A Dimensional Structure based Knowledge Distillation Method for Cross-Modal Learning | Hui Xiong, Hongwei Dong, Wenwen Qiang, J. Yu, Wen-jie Zhai, Changwen Zheng, Fanjiang Xu, Gang Hua | | 24 / 1 / 0 | 28 Jun 2023 |
| Enhancing Mapless Trajectory Prediction through Knowledge Distillation | Yuning Wang, Pu Zhang, Lei Bai, Jianru Xue | | 35 / 4 / 0 | 25 Jun 2023 |
| Categories of Response-Based, Feature-Based, and Relation-Based Knowledge Distillation | Chuanguang Yang, Xinqiang Yu, Zhulin An, Yongjun Xu | VLM, OffRL | 86 / 22 / 0 | 19 Jun 2023 |
| Adaptive Multi-Teacher Knowledge Distillation with Meta-Learning | Hailin Zhang, Defang Chen, Can Wang | | 20 / 12 / 0 | 11 Jun 2023 |
| Budget-Aware Graph Convolutional Network Design using Probabilistic Magnitude Pruning | H. Sahbi | | 21 / 0 / 0 | 30 May 2023 |
| Learning to Learn from APIs: Black-Box Data-Free Meta-Learning | Zixuan Hu, Li Shen, Zhenyi Wang, Baoyuan Wu, Chun Yuan, Dacheng Tao | | 49 / 7 / 0 | 28 May 2023 |
| Triplet Knowledge Distillation | Xijun Wang, Dongyang Liu, Meina Kan, Chunrui Han, Zhongqin Wu, Shiguang Shan | | 37 / 3 / 0 | 25 May 2023 |
| Knowledge Diffusion for Distillation | Tao Huang, Yuan Zhang, Mingkai Zheng, Shan You, Fei Wang, Chao Qian, Chang Xu | | 37 / 51 / 0 | 25 May 2023 |
| NORM: Knowledge Distillation via N-to-One Representation Matching | Xiaolong Liu, Lujun Li, Chao Li, Anbang Yao | | 55 / 68 / 0 | 23 May 2023 |
| Robust Saliency-Aware Distillation for Few-shot Fine-grained Visual Recognition | Haiqi Liu, Chong Chen, Xinrong Gong, Tong Zhang | | 40 / 9 / 0 | 12 May 2023 |