Obtaining meaningful quantitative descriptions of the statistical dependence within multivariate systems is a difficult open problem. Recently, the Partial Information Decomposition (PID) framework was proposed to decompose the mutual information that a set of predictor variables conveys about a target variable into components that are redundant, unique, or synergistic within different subsets of predictors. However, the details of how to implement this framework in practice are still debated. Here, we propose to apply the elegant formalism of the PID to multivariate entropy, resulting in a Partial Entropy Decomposition (PED). We implement the PED with an entropy redundancy measure based on pointwise common surprisal, a natural definition closely related to that of mutual information. We show how this approach can reveal the dyadic versus triadic generative structure of multivariate systems that are indistinguishable with classical Shannon measures. The entropy perspective also shows that misinformation is synergistic entropy, and hence that mutual information itself includes both redundant and synergistic effects. We show the relationships between the PED and mutual information for the case of two predictors, and derive two alternative information decompositions, which we illustrate on several example systems. The new perspective provided by these developments helps to clarify some of the difficulties encountered with the PID approach, and the resulting decompositions provide useful tools for practical data analysis across a range of application areas.
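For the two-variable case mentioned above, the following Python sketch illustrates the relationship the abstract describes. It is a reading of the abstract, not the paper's reference implementation: we assume that for two variables the pointwise co-information reduces to the local mutual information i(x,y) = log2[p(x,y)/(p(x)p(y))], so that redundant entropy is the expectation of the positive part of i, synergistic entropy (misinformation) is the expectation of the negative part, and I(X;Y) is their difference; the function name ped_two_variables is ours.

    import numpy as np

    def ped_two_variables(pxy):
        """Split I(X;Y) into redundant and synergistic entropy components.

        pxy : 2-D array of joint probabilities p(x, y).
        Returns (h_red, h_syn) with I(X;Y) = h_red - h_syn.
        """
        pxy = np.asarray(pxy, dtype=float)
        px = pxy.sum(axis=1, keepdims=True)        # marginal p(x)
        py = pxy.sum(axis=0, keepdims=True)        # marginal p(y)
        mask = pxy > 0                             # skip zero-probability events
        i_local = np.log2(pxy[mask] / (px * py)[mask])          # local mutual information
        h_red = np.sum(pxy[mask] * np.maximum(i_local, 0.0))    # positive pointwise terms
        h_syn = np.sum(pxy[mask] * np.maximum(-i_local, 0.0))   # misinformation terms
        return h_red, h_syn

    # Example: a correlated binary pair.
    pxy = np.array([[0.4, 0.1],
                    [0.1, 0.4]])
    h_red, h_syn = ped_two_variables(pxy)
    print(h_red - h_syn)  # equals I(X;Y), about 0.278 bits here

On this example distribution, h_red - h_syn reproduces the classical value I(X;Y) = H(X) + H(Y) - H(X,Y), while making explicit that positive (redundant) and negative (misinformative, synergistic) pointwise contributions are mixed together inside the mutual information.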