arXiv:1904.05068
Relational Knowledge Distillation
Wonpyo Park, Dongju Kim, Yan Lu, Minsu Cho
10 April 2019
Papers citing "Relational Knowledge Distillation" (9 of 59 shown):
1. Deep Metric Learning via Lifted Structured Feature Embedding
   Hyun Oh Song, Yu Xiang, Stefanie Jegelka, Silvio Savarese (19 Nov 2015, 1,645 citations)

2. Unifying distillation and privileged information
   David Lopez-Paz, Léon Bottou, Bernhard Schölkopf, V. Vapnik (11 Nov 2015, 463 citations)

3. FaceNet: A Unified Embedding for Face Recognition and Clustering
   Florian Schroff, Dmitry Kalenichenko, James Philbin (12 Mar 2015, 13,145 citations)

4. Distilling the Knowledge in a Neural Network
   Geoffrey E. Hinton, Oriol Vinyals, J. Dean (09 Mar 2015, 19,745 citations)

5. FitNets: Hints for Thin Deep Nets
   Adriana Romero, Nicolas Ballas, Samira Ebrahimi Kahou, Antoine Chassang, C. Gatta, Yoshua Bengio (19 Dec 2014, 3,899 citations)

6. Going Deeper with Convolutions
   Christian Szegedy, Wei Liu, Yangqing Jia, P. Sermanet, Scott E. Reed, Dragomir Anguelov, D. Erhan, Vincent Vanhoucke, Andrew Rabinovich (17 Sep 2014, 43,698 citations)

7. Very Deep Convolutional Networks for Large-Scale Image Recognition
   Karen Simonyan, Andrew Zisserman (04 Sep 2014, 100,529 citations)

8. ImageNet Large Scale Visual Recognition Challenge
   Olga Russakovsky, Jia Deng, Hao Su, J. Krause, S. Satheesh, ..., A. Karpathy, A. Khosla, Michael S. Bernstein, Alexander C. Berg, Li Fei-Fei (01 Sep 2014, 39,615 citations)

9. Do Deep Nets Really Need to be Deep?
   Lei Jimmy Ba, R. Caruana (21 Dec 2013, 2,119 citations)