Extropy: a complementary dual of entropy

This article provides a completion to theories of information based on entropy, resolving a longstanding question in its axiomatization as proposed by Shannon and pursued by Jaynes. We introduce extropy as the complementary dual measure of entropy. The entropy and the extropy of a binary distribution are identical. However, the measure bifurcates into a pair of distinct measures for any quantity that is not merely an event indicator. As with entropy, the maximum extropy distribution is also the uniform distribution, and both measures are invariant with respect to permutations of their mass functions. However, they behave quite differently in their assessments of the refinement of a distribution, which concerned Shannon and Jaynes. The formal duality of entropy and extropy is specified via the relationship among the entropies and extropies of coarse and fine partitions. We also assess the differential extropy and relative extropy for densities, showing that relative extropy constitutes a natural complement to the Kullback-Leibler divergence. These results are unified within the general structure of Bregman divergences, in which context they identify the Euclidean distance measure as dual to the entropic measure. We describe a statistical application to the scoring of sequential forecast distributions, which provoked the discovery.
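The claims about the binary case and the uniform maximum can be checked numerically. The sketch below assumes the standard definition from the extropy literature, J(p) = -sum_i (1 - p_i) log(1 - p_i), alongside the Shannon entropy H(p) = -sum_i p_i log(p_i); the function names and example distributions are illustrative, not taken from the paper.

```python
# Minimal sketch, assuming the standard entropy/extropy definitions above.
import numpy as np

def entropy(p):
    """Shannon entropy of a probability mass function p (natural log)."""
    p = np.asarray(p, dtype=float)
    nz = p > 0                       # convention: 0 * log(0) = 0
    return -np.sum(p[nz] * np.log(p[nz]))

def extropy(p):
    """Extropy of p: the same functional applied to the complements 1 - p_i."""
    p = np.asarray(p, dtype=float)
    q = 1.0 - p
    nz = q > 0
    return -np.sum(q[nz] * np.log(q[nz]))

# Binary distribution: entropy and extropy coincide.
binary = [0.3, 0.7]
print(entropy(binary), extropy(binary))     # both ~0.6109

# Three or more outcomes: the measures bifurcate,
# yet both are maximized by the uniform distribution.
skewed  = [0.1, 0.2, 0.7]
uniform = [1/3, 1/3, 1/3]
print(entropy(skewed),  extropy(skewed))    # ~0.8018 vs ~0.6345
print(entropy(uniform), extropy(uniform))   # ~1.0986 vs ~0.8109, each a maximum
```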