Dropout Distillation for Efficiently Estimating Model Confidence
arXiv:1809.10562 · 27 September 2018
Corina Gurau, Alex Bewley, Ingmar Posner
Tags: BDL, UQCV
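The common thread of the citing papers listed below is sampling-based confidence estimation. A minimal NumPy sketch of test-time (Monte Carlo) dropout, the sampling baseline that dropout distillation aims to replace with a single deterministic pass, is given here; the two-layer ReLU network and its weights are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def mc_dropout_confidence(x, W1, W2, p=0.5, T=100, seed=None):
    """Monte Carlo dropout: keep dropout active at test time and
    average the softmax outputs over T stochastic forward passes.
    Returns the predictive distribution and its entropy (uncertainty)."""
    rng = np.random.default_rng(seed)
    probs = np.zeros((T, W2.shape[1]))
    for t in range(T):
        h = np.maximum(x @ W1, 0.0)             # ReLU hidden layer
        mask = rng.random(h.shape) > p          # random dropout mask
        h = h * mask / (1.0 - p)                # inverted-dropout scaling
        probs[t] = softmax(h @ W2)
    mean_p = probs.mean(axis=0)                 # averaged predictive distribution
    entropy = -np.sum(mean_p * np.log(mean_p + 1e-12))
    return mean_p, entropy
```

Distillation replaces the T-pass loop with a single student forward pass trained to match `mean_p`, which is what makes the confidence estimate cheap at evaluation time.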
Papers citing "Dropout Distillation for Efficiently Estimating Model Confidence" (7 of 7 papers shown):

- Efficient Multi-task Uncertainties for Joint Semantic Segmentation and Monocular Depth Estimation · S. Landgraf, Markus Hillemann, Theodor Kapler, Markus Ulrich · UQCV · 16 Feb 2024
- Robust Models are less Over-Confident · Julia Grabinski, Paul Gavrikov, J. Keuper, M. Keuper · AAML · 12 Oct 2022
- Confidence-Aware Learning for Deep Neural Networks · J. Moon, Jihyo Kim, Younghak Shin, Sangheum Hwang · UQCV · 03 Jul 2020
- Efficient Evaluation-Time Uncertainty Estimation by Improved Distillation · Erik Englesson, Hossein Azizpour · UQCV · 12 Jun 2019
- Evaluating Merging Strategies for Sampling-based Uncertainty Techniques in Object Detection · Dimity Miller, Feras Dayoub, Michael Milford, Niko Sünderhauf · 17 Sep 2018
- Simple and Scalable Predictive Uncertainty Estimation using Deep Ensembles · Balaji Lakshminarayanan, Alexander Pritzel, Charles Blundell · UQCV, BDL · 05 Dec 2016
- Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning · Y. Gal, Zoubin Ghahramani · UQCV, BDL · 06 Jun 2015