
Bayesian Metric Learning for Uncertainty Quantification in Image Retrieval

Neural Information Processing Systems (NeurIPS), 2023
Main: 9 pages · Appendix: 14 pages · Bibliography: 3 pages · 11 figures · 4 tables
Abstract

We propose the first Bayesian encoder for metric learning. Rather than relying on neural amortization as in prior work, we learn a distribution over the network weights with the Laplace approximation. We realize this by first proving that the contrastive loss is a valid log-posterior. We then propose three methods that ensure a positive-definite Hessian. Lastly, we present a novel decomposition of the Generalized Gauss-Newton approximation. Empirically, we show that our Laplacian Metric Learner (LAM) estimates well-calibrated uncertainties, reliably detects out-of-distribution examples, and yields state-of-the-art predictive performance.
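The core idea above — a Gaussian posterior over encoder weights centered at the MAP estimate, with covariance given by the inverse Hessian of the log-posterior — can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the encoder is a toy linear map, and a random symmetric positive-definite matrix stands in for the (GGN-approximated) Hessian that the paper's methods guarantee.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear encoder f(x) = W x; W_map stands in for trained MAP weights.
d_in, d_emb = 4, 2
W_map = rng.normal(size=(d_emb, d_in))

# Stand-in for a positive-definite Hessian of the contrastive log-posterior
# at the MAP estimate (SPD by construction: A A^T + I).
n_w = d_emb * d_in
A = rng.normal(size=(n_w, n_w))
H = A @ A.T + np.eye(n_w)

# Laplace approximation: weight posterior is N(w_map, H^{-1}).
cov = np.linalg.inv(H)
L = np.linalg.cholesky(cov)

def sample_embeddings(x, n_samples=100):
    """Propagate weight uncertainty to the embedding via Monte Carlo."""
    w_map = W_map.ravel()
    eps = rng.normal(size=(n_samples, n_w))
    ws = w_map + eps @ L.T                  # draws from N(w_map, H^{-1})
    Ws = ws.reshape(n_samples, d_emb, d_in)
    return Ws @ x                           # (n_samples, d_emb)

x = rng.normal(size=d_in)
emb = sample_embeddings(x)
emb_mean, emb_std = emb.mean(axis=0), emb.std(axis=0)
```

The per-dimension standard deviation `emb_std` is the kind of embedding-space uncertainty that can be thresholded for out-of-distribution detection or used to calibrate retrieval confidence.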
