Model Distillation with Knowledge Transfer from Face Classification to Alignment and Verification
Chong-Jun Wang, Xipeng Lan, Yang Zhang
arXiv:1709.02929, 9 September 2017
Tags: CVBM
Papers citing "Model Distillation with Knowledge Transfer from Face Classification to Alignment and Verification" (7 of 7 papers shown)

Title | Authors | Tags | Citations | Date
Distilling interpretable causal trees from causal forests | Patrick Rehill | CML | 0 | 02 Aug 2024
Knowledge Distillation as Semiparametric Inference | Tri Dao, G. Kamath, Vasilis Syrgkanis, Lester W. Mackey | | 31 | 20 Apr 2021
Knowledge Distillation: A Survey | Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao | VLM | 2,872 | 09 Jun 2020
Multitask Emotion Recognition with Incomplete Labels | Didan Deng, Zhaokang Chen, Bertram E. Shi | CVBM | 94 | 10 Feb 2020
Frosting Weights for Better Continual Training | Xiaofeng Zhu, Feng Liu, Goce Trajcevski, Dingding Wang | FedML, AI4CE, CLL | 5 | 07 Jan 2020
Factorized Distillation: Training Holistic Person Re-identification Model by Distilling an Ensemble of Partial ReID Models | Pengyuan Ren, Jianmin Li | | 9 | 20 Nov 2018
Knowledge Distillation with Adversarial Samples Supporting Decision Boundary | Byeongho Heo, Minsik Lee, Sangdoo Yun, J. Choi | AAML | 146 | 15 May 2018