Anti-Distillation: Improving reproducibility of deep networks
G. Shamir, Lorenzo Coviello
19 October 2020 · arXiv:2010.09923
Papers citing "Anti-Distillation: Improving reproducibility of deep networks" (10 / 10 papers shown)
| Title | Authors | Citations | Date |
|---|---|---|---|
| A Tale of Two Imperatives: Privacy and Explainability | Supriya Manna, Niladri Sett | 0 | 30 Dec 2024 |
| Optimal Guarantees for Algorithmic Reproducibility and Gradient Complexity in Convex Optimization | Liang Zhang, Junchi Yang, Amin Karbasi, Niao He | 2 | 26 Oct 2023 |
| Similarity of Neural Network Models: A Survey of Functional and Representational Measures | Max Klabunde, Tobias Schumacher, M. Strohmaier, Florian Lemmerich | 64 | 10 May 2023 |
| On the Factory Floor: ML Engineering for Industrial-Scale Ads Recommendation Models | Rohan Anil, S. Gadanho, Danya Huang, Nijith Jacob, Zhuoshu Li, ..., Cristina Pop, Kevin Regan, G. Shamir, Rakesh Shivanna, Qiqi Yan | 41 | 12 Sep 2022 |
| On the Prediction Instability of Graph Neural Networks | Max Klabunde, Florian Lemmerich | 5 | 20 May 2022 |
| Reducing Model Jitter: Stable Re-training of Semantic Parsers in Production Environments | Christopher Hidey, Fei Liu, Rahul Goel | 4 | 10 Apr 2022 |
| Randomness In Neural Network Training: Characterizing The Impact of Tooling | Donglin Zhuang, Xingyao Zhang, S. Song, Sara Hooker | 75 | 22 Jun 2021 |
| Knowledge Distillation by On-the-Fly Native Ensemble | Xu Lan, Xiatian Zhu, S. Gong | 473 | 12 Jun 2018 |
| Large scale distributed neural network training through online distillation | Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton | 404 | 09 Apr 2018 |
| Simple and Scalable Predictive Uncertainty Estimation using Deep Ensembles | Balaji Lakshminarayanan, Alexander Pritzel, Charles Blundell | 5,660 | 05 Dec 2016 |