In information theory, dual total correlation, [1] information rate, [2] excess entropy, [3][4] or binding information [5] is one of several known non-negative generalizations of mutual information. While the total correlation is bounded above by the sum of the entropies of the n elements, the dual total correlation is bounded above by the joint entropy of the n elements. Although well behaved, dual total correlation has received much less attention than the total correlation. A measure known as "TSE-complexity" defines a continuum between the total correlation and the dual total correlation. [3]
For a set of n random variables $\{X_1, \ldots, X_n\}$, the dual total correlation is given by

$$D(X_1,\ldots,X_n) = H(X_1,\ldots,X_n) - \sum_{i=1}^{n} H(X_i \mid X_1,\ldots,X_{i-1},X_{i+1},\ldots,X_n),$$

where $H(X_1,\ldots,X_n)$ is the joint entropy of the variable set and $H(X_i \mid X_1,\ldots,X_{i-1},X_{i+1},\ldots,X_n)$ is the conditional entropy of variable $X_i$, given the rest.
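As a concrete illustration, here is a minimal sketch (not taken from the cited references) that computes the definition for discrete variables whose joint distribution is stored as a NumPy array with one axis per variable; the helper names are illustrative:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array; zero entries are ignored."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def dual_total_correlation(joint):
    """D(X_1,...,X_n) = H(X_1,...,X_n) - sum_i H(X_i | rest)."""
    n = joint.ndim
    h_joint = entropy(joint)
    d = h_joint
    for i in range(n):
        # Chain rule: H(X_i | rest) = H(X_1,...,X_n) - H(rest),
        # where H(rest) marginalizes out X_i.
        h_rest = entropy(joint.sum(axis=i))
        d -= h_joint - h_rest
    return d

# Example: X, Y independent fair bits, Z = X XOR Y.
joint = np.zeros((2, 2, 2))
for x in (0, 1):
    for y in (0, 1):
        joint[x, y, x ^ y] = 0.25

print(dual_total_correlation(joint))  # 2.0 bits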
The dual total correlation normalized between [0,1] is simply the dual total correlation divided by its maximum value, the joint entropy $H(X_1,\ldots,X_n)$:

$$ND(X_1,\ldots,X_n) = \frac{D(X_1,\ldots,X_n)}{H(X_1,\ldots,X_n)}.$$
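Continuing the sketch above, the normalized quantity is just a division by the joint entropy:

```python
def normalized_dual_total_correlation(joint):
    # D divided by its maximum value, the joint entropy.
    return dual_total_correlation(joint) / entropy(joint)

print(normalized_dual_total_correlation(joint))  # 1.0 for the XOR example
```

The XOR example attains the maximum, which also illustrates the upper bound stated next.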
Dual total correlation is non-negative and bounded above by the joint entropy $H(X_1,\ldots,X_n)$:

$$0 \leq D(X_1,\ldots,X_n) \leq H(X_1,\ldots,X_n).$$
Secondly, dual total correlation has a close relationship with total correlation, $T(X_1,\ldots,X_n)$, and can be written in terms of differences between the total correlation of the whole and of all subsets of size $n-1$: [6]

$$D(\mathbf{X}) = (n-1)\,T(\mathbf{X}) - \sum_{i=1}^{n} T(\mathbf{X}^{-i}),$$

where $\mathbf{X} = \{X_1,\ldots,X_n\}$ and $\mathbf{X}^{-i} = \{X_1,\ldots,X_{i-1},X_{i+1},\ldots,X_n\}$.
Furthermore, the total correlation and dual total correlation are related by the following bounds:

$$\frac{T(X_1,\ldots,X_n)}{n-1} \leq D(X_1,\ldots,X_n) \leq (n-1)\,T(X_1,\ldots,X_n).$$
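Both the identity and the bounds can be checked numerically with the earlier sketch; `total_correlation` below is an illustrative helper computing $T$ as the sum of marginal entropies minus the joint entropy:

```python
def total_correlation(joint):
    """T(X_1,...,X_n) = sum_i H(X_i) - H(X_1,...,X_n)."""
    n = joint.ndim
    h_marginals = sum(
        entropy(joint.sum(axis=tuple(j for j in range(n) if j != i)))
        for i in range(n)
    )
    return h_marginals - entropy(joint)

def dtc_from_tc(joint):
    """D(X) = (n-1) T(X) - sum_i T(X^{-i})."""
    n = joint.ndim
    return (n - 1) * total_correlation(joint) - sum(
        total_correlation(joint.sum(axis=i)) for i in range(n)
    )

t, d, n = total_correlation(joint), dual_total_correlation(joint), joint.ndim
assert abs(d - dtc_from_tc(joint)) < 1e-9                     # the identity above
assert t / (n - 1) <= d + 1e-9 and d <= (n - 1) * t + 1e-9    # the bounds above
```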
Finally, the difference between the total correlation and the dual total correlation defines a novel measure of higher-order information-sharing, the O-information: [7]

$$\Omega(\mathbf{X}) = T(\mathbf{X}) - D(\mathbf{X}).$$
The O-information (first introduced as the "enigmatic information" by James and Crutchfield [8]) is a signed measure that quantifies the extent to which the information in a multivariate random variable is dominated by synergistic interactions (in which case $\Omega(\mathbf{X}) < 0$) or redundant interactions (in which case $\Omega(\mathbf{X}) > 0$); it has found multiple applications in neuroscience. [9]
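A short numerical illustration, again building on the illustrative helpers above: the three-bit XOR system is synergy-dominated, while three copies of a single bit are redundancy-dominated:

```python
def o_information(joint):
    # Omega(X) = T(X) - D(X).
    return total_correlation(joint) - dual_total_correlation(joint)

print(o_information(joint))   # -1.0 bits: XOR is synergy-dominated

copies = np.zeros((2, 2, 2))
copies[0, 0, 0] = copies[1, 1, 1] = 0.5  # three perfectly correlated bits
print(o_information(copies))  # +1.0 bits: copying is redundancy-dominated
```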
Han (1978) originally defined the dual total correlation as

$$D(X_1,\ldots,X_n) \equiv \left[\sum_{i=1}^{n} H(X_1,\ldots,X_{i-1},X_{i+1},\ldots,X_n)\right] - (n-1)\,H(X_1,\ldots,X_n).$$
However, Abdallah and Plumbley (2010) showed its equivalence to the easier-to-understand form of the joint entropy minus the sum of conditional entropies via the following derivation:

$$\begin{aligned}
D(X_1,\ldots,X_n) &= \left[\sum_{i=1}^{n} H(X_1,\ldots,X_{i-1},X_{i+1},\ldots,X_n)\right] - (n-1)\,H(X_1,\ldots,X_n) \\
&= H(X_1,\ldots,X_n) - \sum_{i=1}^{n} \left[ H(X_1,\ldots,X_n) - H(X_1,\ldots,X_{i-1},X_{i+1},\ldots,X_n) \right] \\
&= H(X_1,\ldots,X_n) - \sum_{i=1}^{n} H(X_i \mid X_1,\ldots,X_{i-1},X_{i+1},\ldots,X_n),
\end{aligned}$$

where the last step uses the chain rule for entropy, $H(X_i \mid X_1,\ldots,X_{i-1},X_{i+1},\ldots,X_n) = H(X_1,\ldots,X_n) - H(X_1,\ldots,X_{i-1},X_{i+1},\ldots,X_n)$.
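The equivalence is also easy to confirm numerically with the earlier sketch; `dtc_han` is an illustrative helper implementing Han's original expression:

```python
def dtc_han(joint):
    # Han's form: sum_i H(X^{-i}) - (n-1) H(X_1,...,X_n).
    n = joint.ndim
    sum_rest = sum(entropy(joint.sum(axis=i)) for i in range(n))
    return sum_rest - (n - 1) * entropy(joint)

assert abs(dtc_han(joint) - dual_total_correlation(joint)) < 1e-9
```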