SecDD: Efficient and Secure Method for Remotely Training Neural Networks
Ilia Sucholutsky, Matthias Schonlau
arXiv:2009.09155, 19 September 2020
Papers citing "SecDD: Efficient and Secure Method for Remotely Training Neural Networks" (9 papers):
1. 'Less Than One'-Shot Learning: Learning N Classes From M<N Samples. Ilia Sucholutsky, Matthias Schonlau. 17 Sep 2020.
2. Flexible Dataset Distillation: Learn Labels Instead of Images. Ondrej Bohdal, Yongxin Yang, Timothy M. Hospedales. 15 Jun 2020.
3. Dataset Condensation with Gradient Matching. Bo Zhao, Konda Reddy Mopuri, Hakan Bilen. 10 Jun 2020.
4. Optimizing Millions of Hyperparameters by Implicit Differentiation. Jonathan Lorraine, Paul Vicol, David Duvenaud. 06 Nov 2019.
5. Soft-Label Dataset Distillation and Text Dataset Distillation. Ilia Sucholutsky, Matthias Schonlau. 06 Oct 2019.
6. Using Small Proxy Datasets to Accelerate Hyperparameter Search. Sam Shleifer, Eric Prokop. 12 Jun 2019.
7. Energy and Policy Considerations for Deep Learning in NLP. Emma Strubell, Ananya Ganesh, Andrew McCallum. 05 Jun 2019.
8. Hyperspherical Prototype Networks. Pascal Mettes, Elise van der Pol, Cees G. M. Snoek. 29 Jan 2019.
9. Dataset Distillation. Tongzhou Wang, Jun-Yan Zhu, Antonio Torralba, Alexei A. Efros. 27 Nov 2018.