Empirical estimation of information

Informing yourself from your data about how informative your data was



This is an empirical probability metric estimation problem with especially cruel error properties. There are a few different versions of the problem, corresponding to different information quantities: mutual information between two variables, KL divergence between two distributions, the information content of a single variable; discrete variables, continuous variables… In the mutual information case this doubles as an independence test.

Say I would like to know the mutual information between the laws of the processes generating two streams \(X,Y\) of observations, under weak assumptions on those laws. Better, suppose further that each observation from each process is i.i.d. If they have a continuous state space with joint density \(p_{X,Y}\) and marginal densities \(p_{X},p_{Y}\), then

\[ \operatorname{I}(X;Y)=\int_{\mathcal{Y}}\int_{\mathcal{X}} p_{X,Y}(x,y)\log\left(\frac{p_{X,Y}(x,y)}{p_{X}(x)\,p_{Y}(y)}\right)\,dx\,dy\]

Information is harder to estimate than many other quantities, because observations with low frequency have a large influence on the value and yet are, by definition, rarely observed. It is easy to end up with a uselessly biased — or even inconsistent — estimator, especially in the nonparametric case.
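As a concrete reference point: for a bivariate Gaussian with correlation \(\rho\) the integral above evaluates to \(\operatorname{I}(X;Y)=-\tfrac{1}{2}\log(1-\rho^{2})\). Here is a minimal sketch, assuming scikit-learn is available, that checks a nearest-neighbour estimator (in the spirit of Kraskov, Stögbauer, and Grassberger 2004, which is what `mutual_info_regression` is based on) against that closed form; the seed, sample size and correlation are arbitrary choices of mine:

```python
# Sanity-check a nonparametric MI estimator against a case with a known answer.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)
rho, n = 0.8, 10_000
cov = [[1.0, rho], [rho, 1.0]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

mi_true = -0.5 * np.log(1 - rho**2)   # closed form for the bivariate Gaussian, ~0.51 nats
mi_knn = mutual_info_regression(x.reshape(-1, 1), y, n_neighbors=3)[0]
print(f"closed form: {mi_true:.3f} nats, kNN estimate: {mi_knn:.3f} nats")
```

Nothing in the estimator knows the data are Gaussian; the closed form is only there to calibrate how much to trust the estimate.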

Histogram estimator

The obvious one for discrete data. For continuous data, where the histogram bins must also be learned, this method is highly sensitive to the choice of binning and can be inconsistent if you don’t do it right (Paninski 2003).
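For genuinely discrete data the plug-in version simply substitutes empirical frequencies into the definition. A minimal sketch (the function name and the bit-flip example are mine, for illustration; the estimator is biased upward when the number of cells is large relative to the sample size, which is the pathology Paninski analyses):

```python
import numpy as np

def plugin_mutual_information(x, y):
    """Plug-in estimate of I(X;Y) in nats from paired discrete observations."""
    x, y = np.asarray(x), np.asarray(y)
    # Empirical joint distribution as a normalised contingency table.
    _, xi = np.unique(x, return_inverse=True)
    _, yi = np.unique(y, return_inverse=True)
    joint = np.zeros((xi.max() + 1, yi.max() + 1))
    np.add.at(joint, (xi, yi), 1.0)
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)   # marginal of X (column vector)
    py = joint.sum(axis=0, keepdims=True)   # marginal of Y (row vector)
    nz = joint > 0                          # convention: 0 log 0 = 0
    return float(np.sum(joint[nz] * np.log(joint[nz] / (px @ py)[nz])))

# Example: Y is X passed through a 10% bit-flip channel.
rng = np.random.default_rng(1)
x = rng.integers(0, 2, size=50_000)
y = np.where(rng.random(x.size) < 0.1, 1 - x, x)
print(plugin_mutual_information(x, y))      # about log 2 - H_b(0.1), i.e. ~0.37 nats
```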

Parametric

🏗

Monte Carlo parametric

One case where you might want to estimate this value is where there is no nonparametric estimation problem per se, but the integral required to compute it is inconvenient. In that case, we might use a Monte Carlo method.

John Schulman explicates a good trick for estimating KL divergence in the case that you can simulate \(x_i\sim q\) and evaluate \(p(x_i)\) and \(q(x_i)\). The following estimator is good despite looking unrelated:

\[\begin{aligned} KL[q, p] &= \int_x q(x) \log \frac{q(x)}{p(x)} \mathrm{d}x\\ &= E_{x \sim q}\left[\log \frac{q(x)}{p(x)} \right]\\ &\approx \frac1N \sum_{i=1}^N \frac12\left(\log p(x_i)-\log q(x_i)\right)^2 \end{aligned}\]

He also introduces a simple debiased estimator that does even better; the mechanics are interesting. (If you actually want a mutual information, this notionally gives it to you by taking the KL divergence between the joint density and the product of the marginals; but I concede that is not totally trivial.)
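A minimal sketch of the above, assuming for illustration that \(q\) and \(p\) are unit-variance Gaussians with known densities, so the true KL divergence is available for comparison. Here `k2` is the \(\tfrac12(\log r)^2\) estimator from the display, with \(r=p(x)/q(x)\), and `k3=(r-1)-\log r` is a debiased candidate of the kind he describes:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
q = norm(loc=0.0, scale=1.0)          # sampling distribution
p = norm(loc=0.5, scale=1.0)          # target distribution
kl_true = 0.5 * (0.5 - 0.0) ** 2      # KL[q, p] between unit-variance Gaussians = (mu_q - mu_p)^2 / 2

x = q.rvs(size=200_000, random_state=rng)
log_r = p.logpdf(x) - q.logpdf(x)     # log density ratio r = p(x)/q(x) with x ~ q

k1 = -log_r                           # naive estimator: unbiased, high variance, can go negative
k2 = 0.5 * log_r**2                   # the estimator in the display: biased but low variance, nonnegative
k3 = (np.exp(log_r) - 1) - log_r      # debiased estimator: unbiased, and every term is nonnegative

print(f"true {kl_true:.4f}  k1 {k1.mean():.4f}  k2 {k2.mean():.4f}  k3 {k3.mean():.4f}")
```

The reason `k3` can be both unbiased and nonnegative is that \(E_q[r-1]=0\), so adding \(r-1\) to the naive term \(-\log r\) changes no expectation while cancelling most of its variance.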

Incoming

Inform: A C library for information analysis of complex systems

Inform is a cross-platform C library designed for performing information analysis of complex systems.

  • The inform_dist struct provides discrete, empirical probability distributions. These form the basis for all of the information-theoretic measures.
  • A collection of information measures built upon the distribution struct provide the core algorithms for the library, and are provided through the shannon.h header.
  • A host of measures of the information dynamics on time series are built upon the core information measures. Each measure is housed in its own header, e.g. active_info.h.

See also ITE toolbox (estimators).

Information Theoretical Estimators (ITE) in Python

It

  • is the redesigned, Python implementation of the Matlab/Octave ITE toolbox.
  • can estimate numerous entropy, mutual information, divergence, association measures, cross quantities, and kernels on distributions.
  • can be used to solve information theoretical optimization problems in a high-level way.
  • comes with several demos.
  • is free and open source: GNU GPLv3(>=).

Estimated quantities:

  • entropy (H): Shannon entropy, Rényi entropy, Tsallis entropy (Havrda and Charvát entropy), Sharma-Mittal entropy, Phi-entropy (f-entropy).
  • mutual information (I): Shannon mutual information (total correlation, multi-information), Rényi mutual information, Tsallis mutual information, chi-square mutual information (squared-loss mutual information, mean square contingency), L2 mutual information, copula-based kernel dependency, kernel canonical correlation analysis, kernel generalized variance, multivariate version of Hoeffding’s Phi, Hilbert-Schmidt independence criterion, distance covariance, distance correlation, Lancaster three-variable interaction.
  • divergence (D): Kullback-Leibler divergence (relative entropy, I directed divergence), Rényi divergence, Tsallis divergence, Sharma-Mittal divergence, Pearson chi-square divergence (chi-square distance), Hellinger distance, L2 divergence, f-divergence (Csiszár-Morimoto divergence, Ali-Silvey distance), maximum mean discrepancy (kernel distance, current distance), energy distance (N-distance; specifically the Cramer-Von Mises distance), Bhattacharyya distance, non-symmetric Bregman distance (Bregman divergence), symmetric Bregman distance, J-distance (symmetrised Kullback-Leibler divergence, J divergence), K divergence, L divergence, Jensen-Shannon divergence, Jensen-Rényi divergence, Jensen-Tsallis divergence.
  • association measures (A): multivariate extensions of Spearman’s rho (Spearman’s rank correlation coefficient, grade correlation coefficient), multivariate conditional version of Spearman’s rho, lower and upper tail dependence via conditional Spearman’s rho.
  • cross quantities (C): cross-entropy.
  • kernels on distributions (K): expected kernel (summation kernel, mean map kernel, set kernel, multi-instance kernel, ensemble kernel; specific convolution kernel), probability product kernel, Bhattacharyya kernel (Bhattacharyya coefficient, Hellinger affinity), Jensen-Shannon kernel, Jensen-Tsallis kernel, exponentiated Jensen-Shannon kernel, exponentiated Jensen-Rényi kernels, exponentiated Jensen-Tsallis kernels.
  • conditional entropy (condH): conditional Shannon entropy.
  • conditional mutual information (condI): conditional Shannon mutual information.

References

Akaike, Hirotogu. 1973. “Information Theory and an Extension of the Maximum Likelihood Principle.” In Proceedings of the Second International Symposium on Information Theory, edited by Petrov and F Caski, 199–213. Budapest: Akademiai Kiado.
Amigó, José M, Janusz Szczepański, Elek Wajnryb, and Maria V Sanchez-Vives. 2004. “Estimating the Entropy Rate of Spike Trains via Lempel-Ziv Complexity.” Neural Computation 16 (4): 717–36.
Crumiller, Marshall, Bruce Knight, Yunguo Yu, and Ehud Kaplan. 2011. “Estimating the Amount of Information Conveyed by a Population of Neurons.” Frontiers in Neuroscience 5: 90.
Elith, Jane, Steven J. Phillips, Trevor Hastie, Miroslav Dudík, Yung En Chee, and Colin J. Yates. 2011. “A Statistical Explanation of MaxEnt for Ecologists.” Diversity and Distributions 17 (1): 43–57.
Gao, Shuyang, Greg Ver Steeg, and Aram Galstyan. 2015. “Efficient Estimation of Mutual Information for Strongly Dependent Variables.” In Journal of Machine Learning Research, 277–86.
Goodman, Joshua. 2002. “Extended Comment on Language Trees and Zipping.” arXiv:cond-mat/0202383, February.
Grassberger, Peter. 1988. “Finite Sample Corrections to Entropy and Dimension Estimates.” Physics Letters A 128 (6–7): 369–73.
Haslinger, Robert, Kristina Lisa Klinkner, and Cosma Rohilla Shalizi. 2010. “The Computational Structure of Spike Trains.” Neural Computation 22 (1): 121–57.
Hausser, Jean, and Korbinian Strimmer. 2009. “Entropy Inference and the James-Stein Estimator, with Application to Nonlinear Gene Association Networks.” Journal of Machine Learning Research 10: 1469.
Kandasamy, Kirthevasan, Akshay Krishnamurthy, Barnabas Poczos, Larry Wasserman, and James M. Robins. 2014. “Influence Functions for Machine Learning: Nonparametric Estimators for Entropies, Divergences and Mutual Informations.” arXiv:1411.4342 [stat], November.
Kraskov, Alexander, Harald Stögbauer, and Peter Grassberger. 2004. “Estimating Mutual Information.” Physical Review E 69: 066138.
Kulhavý, Rudolf. 1990. “Recursive Nonlinear Estimation: A Geometric Approach.” Automatica 26 (3): 545–55.
Leinster, Tom. 2021. “Entropy and Diversity: The Axiomatic Approach.” arXiv.
Marzen, S. E., and J. P. Crutchfield. 2020. “Inference, Prediction, and Entropy-Rate Estimation of Continuous-Time, Discrete-Event Processes.” arXiv:2005.03750 [cond-mat, physics:nlin, stat], May.
Moon, Kevin R., and Alfred O. Hero III. 2014. “Multivariate f-Divergence Estimation With Confidence.” In NIPS 2014.
Nemenman, Ilya, William Bialek, and Rob de Ruyter van Steveninck. 2004. “Entropy and Information in Neural Spike Trains: Progress on the Sampling Problem.” Physical Review E 69 (5): 056111.
Nemenman, Ilya, Fariel Shafee, and William Bialek. 2001. “Entropy and Inference, Revisited.” In arXiv:physics/0108025.
Paninski, Liam. 2003. “Estimation of Entropy and Mutual Information.” Neural Computation 15 (6): 1191–1253.
Roulston, Mark S. 1999. “Estimating the Errors on Measured Entropy and Mutual Information.” Physica D: Nonlinear Phenomena 125 (3–4): 285–94.
Rubinfeld, Ronitt. 2012. “Taming Big Probability Distributions.” XRDS: Crossroads, The ACM Magazine for Students 19 (1): 24–28.
Schürmann, Thomas. 2015. “A Note on Entropy Estimation.” Neural Computation 27 (10): 2097–2106.
Shibata, Ritei. 1997. “Bootstrap Estimate of Kullback-Leibler Information for Model Selection.” Statistica Sinica 7: 375–94.
Song, Jiaming, and Stefano Ermon. 2020. “Understanding the Limitations of Variational Mutual Information Estimators.” In, 18.
Taylor, Samuel F, Naftali Tishby, and William Bialek. 2007. “Information and Fitness.” arXiv preprint arXiv:0712.4382.
Wolf, David R., and David H. Wolpert. 1994. “Estimating Functions of Distributions from A Finite Set of Samples, Part 2: Bayes Estimators for Mutual Information, Chi-Squared, Covariance and Other Statistics.” arXiv:comp-gas/9403002, March.
Wolpert, David H., and David R. Wolf. 1994. “Estimating Functions of Probability Distributions from a Finite Set of Samples, Part 1: Bayes Estimators and the Shannon Entropy.” arXiv:comp-gas/9403001, March.
Zhang, Zhiyi, and Michael Grabchak. 2014. “Nonparametric Estimation of Kullback-Leibler Divergence.” Neural Computation 26 (11): 2570–93.
