Balanced Hyperbolic Embeddings Are Natural Out-of-Distribution Detectors

Main: 19 pages · Bibliography: 5 pages · 10 figures · 9 tables
Abstract

Out-of-distribution recognition is an important and well-studied problem in deep learning, with the goal of filtering out samples that do not belong to the distribution on which a network has been trained. The conclusion of this paper is simple: a good hierarchical hyperbolic embedding is preferred for discriminating in- and out-of-distribution samples. We introduce Balanced Hyperbolic Learning. We outline a hyperbolic class embedding algorithm that jointly optimizes for hierarchical distortion and balancing between shallow and wide subhierarchies. We then use the class embeddings as hyperbolic prototypes for classification on in-distribution data. We outline how to generalize existing out-of-distribution scoring functions to operate with hyperbolic prototypes. Empirical evaluations across 13 datasets and 13 scoring functions show that our hyperbolic embeddings outperform existing out-of-distribution approaches when trained on the same data with the same backbones. We also show that our hyperbolic embeddings outperform other hyperbolic approaches, beat state-of-the-art contrastive methods, and natively enable hierarchical out-of-distribution generalization.
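To make the abstract's core idea concrete, here is a minimal sketch of nearest-prototype classification and distance-based out-of-distribution scoring in the Poincaré ball. The prototype coordinates, function names, and the specific score (negative geodesic distance to the nearest prototype) are illustrative assumptions, not the paper's actual algorithm or learned embeddings; only the Poincaré distance formula itself is standard.

```python
import numpy as np

def poincare_distance(x, y):
    # Geodesic distance in the Poincare ball (curvature -1):
    # d(x, y) = arccosh(1 + 2||x - y||^2 / ((1 - ||x||^2)(1 - ||y||^2)))
    sq_dist = np.sum((x - y) ** 2, axis=-1)
    denom = (1 - np.sum(x ** 2, axis=-1)) * (1 - np.sum(y ** 2, axis=-1))
    return np.arccosh(1 + 2 * sq_dist / denom)

# Hypothetical class prototypes inside the unit ball, standing in for the
# paper's learned balanced hyperbolic class embeddings.
prototypes = np.array([[0.3, 0.0], [-0.3, 0.0], [0.0, 0.4]])

def classify(z):
    # In-distribution classification: assign the class of the nearest
    # prototype under hyperbolic distance.
    return int(np.argmin(poincare_distance(z[None, :], prototypes)))

def ood_score(z):
    # A hyperbolic analogue of a distance-based scoring function: the
    # negative distance to the nearest prototype. Higher scores indicate
    # more in-distribution; far-from-all-prototypes samples score low.
    return -float(np.min(poincare_distance(z[None, :], prototypes)))
```

A sample embedded near a class prototype receives a higher score than one embedded near the ball's boundary far from every prototype, which is what makes thresholding this score usable as an out-of-distribution detector.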

@article{kasarla2025_2506.10146,
  title={Balanced Hyperbolic Embeddings Are Natural Out-of-Distribution Detectors},
  author={Tejaswi Kasarla and Max van Spengler and Pascal Mettes},
  journal={arXiv preprint arXiv:2506.10146},
  year={2025}
}