Unique Information via Dependency Constraints

Abstract
The partial information decomposition is perhaps the leading proposal for resolving the information that a set of source variables carries about a target into redundant, synergistic, and unique constituents. Unfortunately, the framework has been hindered by the lack of a generally agreed-upon, multivariate method of quantifying these constituents. Here, we take a step toward rectifying this by developing a decomposition based on a new method of quantifying unique information. The result is the first measure that satisfies the core axioms of the framework while also not treating statistically identical but independent channels as redundant. This marks a key step forward in the practical application of the partial information decomposition.
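To make the decomposition concrete, the sketch below illustrates the standard bookkeeping that any partial information decomposition must respect, not the paper's dependency-constraint measure itself: for two sources and a target, I(X1;Y) = R + U1, I(X2;Y) = R + U2, and I(X1,X2;Y) = R + U1 + U2 + S, where R, U1, U2, and S denote redundant, unique, and synergistic information. The helper function and the XOR example are illustrative assumptions.

```python
# Minimal sketch: mutual-information accounting that constrains any PID,
# shown on the canonical XOR gate (this is NOT the dependency-constraint
# measure proposed in the paper).
from collections import Counter
from itertools import product
from math import log2


def mutual_information(joint, x_idx, y_idx):
    """I(X;Y) in bits, where joint maps outcome tuples to probabilities and
    x_idx / y_idx select the coordinates forming X and Y."""
    px, py, pxy = Counter(), Counter(), Counter()
    for outcome, p in joint.items():
        x = tuple(outcome[i] for i in x_idx)
        y = tuple(outcome[i] for i in y_idx)
        px[x] += p
        py[y] += p
        pxy[(x, y)] += p
    return sum(p * log2(p / (px[x] * py[y])) for (x, y), p in pxy.items() if p > 0)


# XOR gate: Y = X1 ^ X2 with uniform, independent binary inputs.
xor = {(x1, x2, x1 ^ x2): 0.25 for x1, x2 in product((0, 1), repeat=2)}

i1 = mutual_information(xor, (0,), (2,))     # I(X1;Y)    = R + U1
i2 = mutual_information(xor, (1,), (2,))     # I(X2;Y)    = R + U2
i12 = mutual_information(xor, (0, 1), (2,))  # I(X1,X2;Y) = R + U1 + U2 + S

print(i1, i2, i12)  # 0.0, 0.0, 1.0
```

Here the individual informations vanish while the joint information is one bit, so under nonnegativity R = U1 = U2 = 0 and S = 1: XOR is purely synergistic. A specific PID measure, such as the one proposed in this paper, is needed to pin down R, U1, U2, and S for distributions where these equations alone leave them underdetermined.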