Mitigating the Likelihood Paradox in Flow-based OOD Detection via Entropy Manipulation

Donghwan Kim
Hyunsoo Yoon
Main: 10 pages · 4 figures · 7 tables · Bibliography: 3 pages · Appendix: 15 pages
Abstract

Deep generative models that can tractably compute input likelihoods, including normalizing flows, often assign unexpectedly high likelihoods to out-of-distribution (OOD) inputs. We mitigate this likelihood paradox by manipulating input entropy based on semantic similarity, applying stronger perturbations to inputs that are less similar to an in-distribution memory bank. We provide a theoretical analysis showing that this entropy control widens the expected log-likelihood gap between in-distribution and OOD samples in favor of in-distribution data, and we explain why the procedure works without any additional training of the density model. We then evaluate our method against likelihood-based OOD detectors on standard benchmarks and find consistent AUROC improvements over baselines, supporting our explanation.
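The core idea described above, scaling a perturbation by semantic dissimilarity to an in-distribution memory bank, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the helper names, the cosine-similarity measure, and the linear noise schedule `alpha * (1 - max_sim)` are all assumptions for exposition.

```python
import numpy as np

def perturbation_scale(x_feat, memory_bank, alpha=0.1):
    """Noise strength proportional to semantic dissimilarity.

    Hypothetical helper: compares a feature vector against a memory
    bank of in-distribution features via cosine similarity and maps
    low similarity to a larger perturbation (higher input entropy).
    """
    # Normalize the memory bank rows and the query feature.
    mb = memory_bank / np.linalg.norm(memory_bank, axis=1, keepdims=True)
    f = x_feat / np.linalg.norm(x_feat)
    # Cosine similarity to every stored in-distribution feature.
    max_sim = (mb @ f).max()
    # Less similar inputs receive a stronger perturbation.
    return alpha * (1.0 - max_sim)

def perturb(x, sigma, rng):
    """Entropy manipulation: additive Gaussian noise of strength sigma."""
    return x + rng.normal(0.0, sigma, size=x.shape)
```

An input whose features match the memory bank closely is left almost untouched, while a dissimilar (likely OOD) input is perturbed strongly; the density model's likelihood is then evaluated on the perturbed input, with no retraining of the flow.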
