AuG-KD: Anchor-Based Mixup Generation for Out-of-Domain Knowledge Distillation
11 March 2024
Zihao Tang, Zheqi Lv, Shengyu Zhang, Yifan Zhou, Xinyu Duan, Leilei Gan, Kun Kuang

Papers citing "AuG-KD: Anchor-Based Mixup Generation for Out-of-Domain Knowledge Distillation" (23 / 23 papers shown)

MetaCoCo: A New Few-Shot Classification Benchmark with Spurious Correlation
Min Zhang, Haoxuan Li, Leilei Gan, Kun Kuang
OODD · 71 · 12 · 0 · 30 Apr 2024

ModelGPT: Unleashing LLM's Capabilities for Tailored Model Generation
Zihao Tang, Zheqi Lv, Shengyu Zhang, Leilei Gan, Kun Kuang
80 · 8 · 0 · 18 Feb 2024

C-SFDA: A Curriculum Learning Aided Self-Training Framework for Efficient Source Free Domain Adaptation
Nazmul Karim, Niluthpol Chowdhury Mithun, Abhinav Rajvanshi, Han-Pang Chiu, S. Samarasekera, Nazanin Rahnavard
TTA · 65 · 57 · 0 · 30 Mar 2023

Concurrent Subsidiary Supervision for Unsupervised Source-Free Domain Adaptation
Jogendra Nath Kundu, Suvaansh Bhambri, Akshay Ravindra Kulkarni, Hiran Sarkar, Varun Jampani, R. Venkatesh Babu
56 · 27 · 0 · 27 Jul 2022

Intelligent Request Strategy Design in Recommender System
Xufeng Qian, Yue Xu, Fuyu Lv, Shengyu Zhang, Ziwen Jiang, Qingwen Liu, Xiaoyi Zeng, Tat-Seng Chua, Leilei Gan
52 · 16 · 0 · 23 Jun 2022

Source-Free Domain Adaptation via Distribution Estimation
Ning Ding, Yixing Xu, Yehui Tang, Chao Xu, Yunhe Wang, Dacheng Tao
TTA · 44 · 115 · 0 · 24 Apr 2022

Mosaicking to Distill: Knowledge Distillation from Out-of-Domain Data
Gongfan Fang, Yifan Bao, Mingli Song, Xinchao Wang, Don Xie, Chengchao Shen, Xiuming Zhang
59 · 44 · 0 · 27 Oct 2021

Data-Free Model Extraction
Jean-Baptiste Truong, Pratyush Maini, R. Walls, Nicolas Papernot
MIACV · 65 · 188 · 0 · 30 Nov 2020

Environment Inference for Invariant Learning
Elliot Creager, J. Jacobsen, R. Zemel
OOD · 57 · 382 · 0 · 14 Oct 2020

Energy-based Out-of-distribution Detection
Weitang Liu, Xiaoyun Wang, John Douglas Owens, Yixuan Li
OODD · 255 · 1,349 · 0 · 08 Oct 2020

Data-Free Network Quantization With Adversarial Knowledge Distillation
Yoojin Choi, Jihwan P. Choi, Mostafa El-Khamy, Jungwon Lee
MQ · 51 · 120 · 0 · 08 May 2020

Do We Really Need to Access the Source Data? Source Hypothesis Transfer for Unsupervised Domain Adaptation
Jian Liang, Dapeng Hu, Jiashi Feng
95 · 1,238 · 0 · 20 Feb 2020

Dreaming to Distill: Data-free Knowledge Transfer via DeepInversion
Hongxu Yin, Pavlo Molchanov, Zhizhong Li, J. Álvarez, Arun Mallya, Derek Hoiem, N. Jha, Jan Kautz
62 · 563 · 0 · 18 Dec 2019

EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks
Mingxing Tan, Quoc V. Le
3DV · MedIm · 129 · 18,058 · 0 · 28 May 2019

Zero-shot Knowledge Transfer via Adversarial Belief Matching
P. Micaelli, Amos Storkey
46 · 229 · 0 · 23 May 2019

Searching for MobileNetV3
Andrew G. Howard, Mark Sandler, Grace Chu, Liang-Chieh Chen, Bo Chen, ..., Yukun Zhu, Ruoming Pang, Vijay Vasudevan, Quoc V. Le, Hartwig Adam
323 · 6,737 · 0 · 06 May 2019

Relational Knowledge Distillation
Wonpyo Park, Dongju Kim, Yan Lu, Minsu Cho
63 · 1,405 · 0 · 10 Apr 2019

Dynamic Curriculum Learning for Imbalanced Data Classification
Yiru Wang, Weihao Gan, Jie Yang, Wei Wu, Junjie Yan
61 · 218 · 0 · 21 Jan 2019

ShuffleNet V2: Practical Guidelines for Efficient CNN Architecture Design
Ningning Ma, Xiangyu Zhang, Haitao Zheng, Jian Sun
159 · 4,970 · 0 · 30 Jul 2018

Stable Prediction across Unknown Environments
Kun Kuang, Ruoxuan Xiong, Peng Cui, Susan Athey, Bo Li
OOD · 71 · 167 · 0 · 16 Jun 2018

Deep Hashing Network for Unsupervised Domain Adaptation
Hemanth Venkateswara, José Eusébio, Shayok Chakraborty, S. Panchanathan
OOD · 138 · 2,041 · 0 · 22 Jun 2017

FitNets: Hints for Thin Deep Nets
Adriana Romero, Nicolas Ballas, Samira Ebrahimi Kahou, Antoine Chassang, C. Gatta, Yoshua Bengio
FedML · 278 · 3,870 · 0 · 19 Dec 2014

Microsoft COCO: Common Objects in Context
Tsung-Yi Lin, Michael Maire, Serge J. Belongie, Lubomir Bourdev, Ross B. Girshick, James Hays, Pietro Perona, Deva Ramanan, C. L. Zitnick, Piotr Dollár
ObjD · 373 · 43,524 · 0 · 01 May 2014