Dynamic Knowledge Distillation for Black-box Hypothesis Transfer Learning
arXiv: 2007.12355
24 July 2020
Yiqin Yu, Xu Min, Shiwan Zhao, Jing Mei, Fei Wang, Dongsheng Li, Kenney Ng, Shaochun Li
Papers citing "Dynamic Knowledge Distillation for Black-box Hypothesis Transfer Learning" (5 papers)
A Comprehensive Survey on Transfer Learning
Fuzhen Zhuang, Zhiyuan Qi, Keyu Duan, Dongbo Xi, Yongchun Zhu, Hengshu Zhu, Hui Xiong, Qing He
07 Nov 2019
Zero-Shot Knowledge Distillation in Deep Networks
Gaurav Kumar Nayak, Konda Reddy Mopuri, Vaisakh Shaj, R. Venkatesh Babu, Anirban Chakraborty
20 May 2019
Multitask learning and benchmarking with clinical time series data
Hrayr Harutyunyan, Hrant Khachatrian, David C. Kale, Greg Ver Steeg, Aram Galstyan
Topics: OOD, AI4TS
22 Mar 2017
Hypothesis Transfer Learning via Transformation Functions
S. Du, Jayanth Koushik, Aarti Singh, Barnabás Póczós
03 Dec 2016
FitNets: Hints for Thin Deep Nets
Adriana Romero, Nicolas Ballas, Samira Ebrahimi Kahou, Antoine Chassang, C. Gatta, Yoshua Bengio
Topics: FedML
19 Dec 2014