arXiv:2101.07308
Knowledge Distillation Methods for Efficient Unsupervised Adaptation Across Multiple Domains
18 January 2021
Le Thanh Nguyen-Meidine, Atif Belal, M. Kiran, Jose Dolz, Louis-Antoine Blais-Morin, Eric Granger
Papers citing "Knowledge Distillation Methods for Efficient Unsupervised Adaptation Across Multiple Domains" (6 papers)
Knowledge Distillation for Multi-Target Domain Adaptation in Real-Time Person Re-Identification
Félix Remigereau, Djebril Mekhazni, Sajjad Abdoli, Le Thanh Nguyen-Meidine, Rafael M. O. Cruz, Eric Granger
12 May 2022
Co-Teaching for Unsupervised Domain Adaptation and Expansion
Kaibin Tian, Qijie Wei, Xirong Li
04 Apr 2022
Pre-Training Transformers for Domain Adaptation
Burhan Ul Tayyab, Nicholas Chua
18 Dec 2021
FReTAL: Generalizing Deepfake Detection using Knowledge Distillation and Representation Learning
Minhan Kim, Shahroz Tariq, Simon S. Woo
28 May 2021
Adaptive Pseudo-Label Refinement by Negative Ensemble Learning for Source-Free Unsupervised Domain Adaptation
Waqar Ahmed, Pietro Morerio, Vittorio Murino
29 Mar 2021
Unsupervised Domain Adaptation in the Dissimilarity Space for Person Re-identification
Djebril Mekhazni, Amran Bhuiyan, G. Ekladious, Eric Granger
27 Jul 2020