Entropy vs information
MaxEnt, macrostates, subjective updating, epistemic randomness, Szilard engines, Gibbs paradox…
December 2, 2010 — May 22, 2024
Over at statistical mechanics of statistics we wonder about the connection between statistical mechanics and statistics, which suggests we might also consider the connection between entropy and information. Entropy, the physics concept, and information, the computer-science concept, look danged similar, and yet they are defined for different things. How do they connect?
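At the level of formulas, at least, the overlap is easy to exhibit: Shannon's information entropy and Gibbs' statistical-mechanical entropy are the same functional of a probability distribution, differing only in the log base and a factor of Boltzmann's constant. A minimal sketch:

```python
import math

def shannon_entropy(p, base=2):
    """Shannon entropy H(p) = -sum_i p_i log p_i, in bits by default."""
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def gibbs_entropy(p):
    """Gibbs entropy S = -k_B sum_i p_i ln p_i over microstate
    probabilities: the same sum, natural logs, rescaled by k_B."""
    return k_B * shannon_entropy(p, base=math.e)

# A fair coin carries one bit; the "thermodynamic" version of the
# same two-state distribution is k_B ln 2 (Landauer's bound per bit).
print(shannon_entropy([0.5, 0.5]))  # → 1.0
```

That identity is where the resemblance starts, not where it ends; the interesting question is what the shared functional form is telling us about inference.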
I first looked into this 10+ years ago, but my interest has been piqued again after ignoring it for ages, because de Vries claims to have put the idea to work within the predictive-coding theory of mind, which lends the whole programme somewhat more credibility, and inclines me to return to the original MaxEnt work by Caticha, which now has textbooks about it (Caticha 2008, 2015).
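For concreteness, the MaxEnt recipe in its simplest setting is Jaynes' Brandeis dice problem: among all distributions over the die faces with a given mean, entropy maximisation picks out the exponential-family (Boltzmann-like) one, p_i ∝ exp(−λx_i), with λ fixed by the constraint. A toy sketch, solving for λ by bisection (the bracket and tolerance here are my own choices):

```python
import math

def maxent_dice(target_mean, faces=(1, 2, 3, 4, 5, 6)):
    """MaxEnt distribution over die faces subject to a fixed mean:
    p_i proportional to exp(-lam * x_i), with lam chosen so the
    constraint holds. mean(lam) is decreasing in lam, so bisect."""
    def mean(lam):
        w = [math.exp(-lam * x) for x in faces]
        z = sum(w)
        return sum(x * wi for x, wi in zip(faces, w)) / z

    lo, hi = -50.0, 50.0
    for _ in range(200):
        mid = (lo + hi) / 2
        if mean(mid) > target_mean:
            lo = mid  # mean too high: increase lam
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(-lam * x) for x in faces]
    z = sum(w)
    return [wi / z for wi in w]

# A die whose long-run average is reported as 4.5: MaxEnt loads
# probability monotonically towards the high faces.
p = maxent_dice(4.5)
```

Nothing here is specific to physics: the same constrained maximisation gives the canonical ensemble when the constraint is mean energy, which is exactly the reading Caticha pushes.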
Connected somehow: algorithmic statistics, information geometry.
Shalizi and Moore (2003):
We consider the question of whether thermodynamic macrostates are objective consequences of dynamics, or subjective reflections of our ignorance of a physical system. We argue that they are both; more specifically, that the set of macrostates forms the unique maximal partition of phase space which 1) is consistent with our observations (a subjective fact about our ability to observe the system) and 2) obeys a Markov process (an objective fact about the system’s dynamics). We review the ideas of computational mechanics, an information-theoretic method for finding optimal causal models of stochastic processes, and argue that macrostates coincide with the “causal states” of computational mechanics. Defining a set of macrostates thus consists of an inductive process where we start with a given set of observables, and then refine our partition of phase space until we reach a set of states which predict their own future, i.e. which are Markovian. Macrostates arrived at this way are provably optimal statistical predictors of the future values of our observables.
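The partition-refinement idea in that quote can be sketched in a few lines. This is a toy, CSSR-flavoured version, not the real algorithm: cluster fixed-length histories by their empirical next-symbol distributions, so that histories which predict the same future land in the same "causal state". The golden-mean process (no two consecutive 1s) and the merging tolerance are my choices for illustration:

```python
import random
from collections import defaultdict

def causal_state_partition(seq, L=2, alphabet=(0, 1), tol=0.05):
    """Toy causal-state reconstruction: group length-L histories
    whose empirical next-symbol distributions agree to within tol.
    Each group is a candidate causal state (macrostate)."""
    counts = defaultdict(lambda: defaultdict(int))
    for i in range(len(seq) - L):
        counts[tuple(seq[i:i + L])][seq[i + L]] += 1
    dists = {}
    for hist, c in counts.items():
        total = sum(c.values())
        dists[hist] = tuple(c[a] / total for a in alphabet)
    # greedily merge histories with matching predictive distributions
    states = []  # list of (representative distribution, [histories])
    for hist, d in sorted(dists.items()):
        for rep, members in states:
            if all(abs(a - b) <= tol for a, b in zip(rep, d)):
                members.append(hist)
                break
        else:
            states.append((d, [hist]))
    return states

# Golden-mean process: after a 1 the next symbol is forced to be 0;
# after a 0 it is a fair coin. It has exactly two causal states.
random.seed(0)
seq, last = [], 0
for _ in range(20000):
    last = 0 if last == 1 else random.randint(0, 1)
    seq.append(last)
states = causal_state_partition(seq, L=2)
# histories ending in 1 collapse into the deterministic state,
# histories ending in 0 into the fair-coin state
```

The real CSSR algorithm replaces the naive tolerance with hypothesis tests and grows the history length adaptively, but the refine-until-Markovian logic is the same.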
Statistical Physics of Inference and Bayesian Estimation: informational entropy versus thermodynamic entropy.
John Baez’s A Characterisation of Entropy etc. See also Wolpert (2006).
David Ellerman’s logical entropy stuff, which he has now written up as Ellerman (2017).
Feldman, A Brief Introduction to: Information Theory, Excess Entropy and Computational Mechanics
It Took Me 10 Years to Understand Entropy, Here is What I Learned, by Aurelien Pelissier.

“This article introduces both a new algorithm for reconstructing epsilon-machines from data, as well as the decisional states. These are defined as the internal states of a system that lead to the same decision, based on a user-provided utility or payoff function.”
CRS’s CSSR algorithm.