Feature-map-level Online Adversarial Knowledge Distillation
Inseop Chung, Seonguk Park, Jangho Kim, Nojun Kwak · GAN · 5 February 2020 · arXiv:2002.01775

Papers citing "Feature-map-level Online Adversarial Knowledge Distillation"

25 of 25 papers shown

Multi-teacher Distillation for Multilingual Spelling Correction
Jingfen Zhang, Xuan Guo, S. Bodapati, Christopher Potts · KELM · 20 Nov 2023

A Transformer-Based Model With Self-Distillation for Multimodal Emotion Recognition in Conversations
Hui Ma, Jian Wang, Hongfei Lin, Bo Zhang, Yijia Zhang, Bo Xu · 31 Oct 2023

Teacher-Student Architecture for Knowledge Distillation: A Survey
Chengming Hu, Xuan Li, Danyang Liu, Haolun Wu, Xi Chen, Ju Wang, Xue Liu · 08 Aug 2023

Distilling Universal and Joint Knowledge for Cross-Domain Model Compression on Time Series Data
Qing Xu, Min-man Wu, Xiaoli Li, K. Mao, Zhenghua Chen · 07 Jul 2023

Performance-aware Approximation of Global Channel Pruning for Multitask CNNs
Hancheng Ye, Bo-Wen Zhang, Tao Chen, Jiayuan Fan, Bin Wang · 21 Mar 2023

BD-KD: Balancing the Divergences for Online Knowledge Distillation
Ibtihel Amara, N. Sepahvand, B. Meyer, W. Gross, J. Clark · 25 Dec 2022

Lightning Fast Video Anomaly Detection via Adversarial Knowledge Distillation
Florinel-Alin Croitoru, Nicolae-Cătălin Ristea, D. Dascalescu, Radu Tudor Ionescu, Fahad Shahbaz Khan, M. Shah · 28 Nov 2022

AI-KD: Adversarial learning and Implicit regularization for self-Knowledge Distillation
Hyungmin Kim, Sungho Suh, Sunghyun Baek, Daehwan Kim, Daun Jeong, Hansang Cho, Junmo Kim · 20 Nov 2022

SADT: Combining Sharpness-Aware Minimization with Self-Distillation for Improved Model Generalization
Masud An Nur Islam Fahim, Jani Boutellier · 01 Nov 2022

Teacher-Student Architecture for Knowledge Learning: A Survey
Chengming Hu, Xuan Li, Dan Liu, Xi Chen, Ju Wang, Xue Liu · 28 Oct 2022

Bi-directional Weakly Supervised Knowledge Distillation for Whole Slide Image Classification
Linhao Qu, Xiao-Zhuo Luo, Manning Wang, Zhijian Song · WSOD · 07 Oct 2022

Dense Depth Distillation with Out-of-Distribution Simulated Images
Junjie Hu, Chenyou Fan, Mete Ozay, Hualie Jiang, Tin Lun Lam · 26 Aug 2022

Multi-domain Learning for Updating Face Anti-spoofing Models
Xiao Guo, Yaojie Liu, Anil Jain, Xiaoming Liu · CLL, CVBM · 23 Aug 2022

Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition
Chuanguang Yang, Zhulin An, Helong Zhou, Fuzhen Zhuang, Yongjun Xu, Qian Zhang · 23 Jul 2022

Improved Knowledge Distillation via Adversarial Collaboration
Zhiqiang Liu, Chengkai Huang, Yanxia Liu · 29 Nov 2021

Local-Selective Feature Distillation for Single Image Super-Resolution
Seonguk Park, Nojun Kwak · 22 Nov 2021

Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution
Chuanguang Yang, Zhulin An, Linhang Cai, Yongjun Xu · 07 Sep 2021

Distilling a Powerful Student Model via Online Knowledge Distillation
Shaojie Li, Mingbao Lin, Yan Wang, Yongjian Wu, Yonghong Tian, Ling Shao, Rongrong Ji · FedML · 26 Mar 2021

Student Network Learning via Evolutionary Knowledge Distillation
Kangkai Zhang, Chunhui Zhang, Shikun Li, Dan Zeng, Shiming Ge · 23 Mar 2021

Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation
Mingi Ji, Seungjae Shin, Seunghyun Hwang, Gibeom Park, Il-Chul Moon · 15 Mar 2021

DICE: Diversity in Deep Ensembles via Conditional Redundancy Adversarial Estimation
Alexandre Ramé, Matthieu Cord · FedML · 14 Jan 2021

Bringing AI To Edge: From Deep Learning's Perspective
Di Liu, Hao Kong, Xiangzhong Luo, Weichen Liu, Ravi Subramaniam · 25 Nov 2020

Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao · VLM · 09 Jun 2020

Feature Fusion for Online Mutual Knowledge Distillation
Jangho Kim, Minsung Hyun, Inseop Chung, Nojun Kwak · FedML · 19 Apr 2019

Knowledge Distillation by On-the-Fly Native Ensemble
Xu Lan, Xiatian Zhu, S. Gong · 12 Jun 2018