ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

arXiv:2007.12355 · Cited By
Dynamic Knowledge Distillation for Black-box Hypothesis Transfer Learning

24 July 2020
Yiqin Yu, Xu Min, Shiwan Zhao, Jing Mei, Fei Wang, Dongsheng Li, Kenney Ng, Shaochun Li

Papers citing "Dynamic Knowledge Distillation for Black-box Hypothesis Transfer Learning"

5 / 5 papers shown
A Comprehensive Survey on Transfer Learning
Fuzhen Zhuang, Zhiyuan Qi, Keyu Duan, Dongbo Xi, Yongchun Zhu, Hengshu Zhu, Hui Xiong, Qing He
07 Nov 2019

Zero-Shot Knowledge Distillation in Deep Networks
Gaurav Kumar Nayak, Konda Reddy Mopuri, Vaisakh Shaj, R. Venkatesh Babu, Anirban Chakraborty
20 May 2019

Multitask learning and benchmarking with clinical time series data
Hrayr Harutyunyan, Hrant Khachatrian, David C. Kale, Greg Ver Steeg, Aram Galstyan
OOD, AI4TS
22 Mar 2017

Hypothesis Transfer Learning via Transformation Functions
S. Du, Jayanth Koushik, Aarti Singh, Barnabás Póczós
03 Dec 2016

FitNets: Hints for Thin Deep Nets
Adriana Romero, Nicolas Ballas, Samira Ebrahimi Kahou, Antoine Chassang, C. Gatta, Yoshua Bengio
FedML
19 Dec 2014