arXiv:2404.17023

Out-of-Distribution Detection using Maximum Entropy Coding

25 April 2024
M. Abolfazli, Mohammad Zaeri Amirani, Anders Høst-Madsen, June Zhang, A. Bratincsák
    OOD
Abstract

Given a default distribution $P$ and a set of test data $x^M=\{x_1,x_2,\ldots,x_M\}$, this paper seeks to answer the question whether it is likely that $x^M$ was generated by $P$. For discrete distributions, the definitive answer is in principle given by Kolmogorov-Martin-Löf randomness. In this paper we seek to generalize this to continuous distributions. We consider a set of statistics $T_1(x^M), T_2(x^M), \ldots$. To each statistic we associate its maximum entropy distribution and with this a universal source coder. The maximum entropy distributions are subsequently combined to give a total codelength, which is compared with $-\log P(x^M)$. We show that this approach satisfies a number of theoretical properties. For real-world data, $P$ is usually unknown. We transform the data into a standard distribution in the latent space using a bidirectional generative network and use maximum entropy coding there. We compare the resulting method to other methods that also use generative neural networks to detect anomalies. In most cases, our results show better performance.
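
The sketch below illustrates the codelength comparison described in the abstract, under simplifying assumptions: an i.i.d. standard normal default distribution $P$, a single second-moment statistic, and a two-part code that charges roughly $\tfrac{1}{2}\log M$ nats for the fitted parameter. It is not the paper's exact construction; the function names, the parameter-cost term, and the decision margin are all hypothetical.

```python
# Minimal sketch of maximum entropy coding for OOD detection (illustrative only).
import numpy as np

def default_codelength(x):
    # -log P(x^M) in nats for P = standard normal, i.i.d. samples.
    return np.sum(0.5 * np.log(2 * np.pi) + 0.5 * x**2)

def maxent_codelength(x):
    # The maximum entropy distribution under a second-moment constraint is
    # Gaussian; code the data with the fitted variance and add an assumed
    # (1/2) log M nats for describing the fitted parameter (two-part code).
    M = len(x)
    var = np.mean(x**2)
    fit = np.sum(0.5 * np.log(2 * np.pi * var) + x**2 / (2 * var))
    return fit + 0.5 * np.log(M)

def is_out_of_distribution(x, margin=0.0):
    # Flag x^M as OOD when the maximum entropy code is shorter than the
    # default code by more than the margin (in nats).
    return default_codelength(x) - maxent_codelength(x) > margin

rng = np.random.default_rng(0)
in_dist = rng.standard_normal(200)         # generated by P
shifted = 2.0 * rng.standard_normal(200)   # larger variance than P
print(is_out_of_distribution(in_dist))     # expected: False
print(is_out_of_distribution(shifted))     # expected: True
```

In the paper's setting, several statistics are combined into a total codelength, and the bidirectional generative network maps real data into a latent space with a known standard distribution; the sketch only shows the comparison for one statistic with $P$ known.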
