Direct Distillation between Different Domains
arXiv:2401.06826, 12 January 2024
Jialiang Tang, Shuo Chen, Gang Niu, Hongyuan Zhu, Joey Tianyi Zhou, Chen Gong, Masashi Sugiyama
Papers citing "Direct Distillation between Different Domains" (9 papers)
| Title | Authors | Tags | Date |
|---|---|---|---|
| RFMI: Estimating Mutual Information on Rectified Flow for Text-to-Image Alignment | Chao Wang, Giulio Franzese, A. Finamore, Pietro Michiardi | | 18 Mar 2025 |
| Domain-invariant Progressive Knowledge Distillation for UAV-based Object Detection | Liang Yao, Fan Liu, Chuanyi Zhang, Zhiquan Ou, Ting Wu | VLM | 21 Aug 2024 |
| Decompose, Adjust, Compose: Effective Normalization by Playing with Frequency for Domain Generalization | Sangrok Lee, Jongseong Bae, Ha Young Kim | OOD | 04 Mar 2023 |
| HRKD: Hierarchical Relational Knowledge Distillation for Cross-domain Language Model Compression | Chenhe Dong, Yaliang Li, Ying Shen, Minghui Qiu | VLM | 16 Oct 2021 |
| Model Adaptation: Historical Contrastive Learning for Unsupervised Domain Adaptation without Source Data | Jiaxing Huang, Dayan Guan, Aoran Xiao, Shijian Lu | | 07 Oct 2021 |
| Cross-Domain Adaptive Clustering for Semi-Supervised Domain Adaptation | Jichang Li, Guanbin Li, Yemin Shi, Yizhou Yu | | 19 Apr 2021 |
| Learning Student-Friendly Teacher Networks for Knowledge Distillation | D. Park, Moonsu Cha, C. Jeong, Daesin Kim, Bohyung Han | | 12 Feb 2021 |
| Source Data-absent Unsupervised Domain Adaptation through Hypothesis Transfer and Labeling Transfer | Jian Liang, Dapeng Hu, Yunbo Wang, Ran He, Jiashi Feng | | 14 Dec 2020 |
| Deep Domain-Adversarial Image Generation for Domain Generalisation | Kaiyang Zhou, Yongxin Yang, Timothy M. Hospedales, Tao Xiang | OOD | 12 Mar 2020 |