AFAICT, this is the question ‘how much worse do your predictions get as you discard information in some orderly fashion?’, as framed by physicists.

Do “renormalisation groups”, whatever they are, fit in here? Fast-slow systems?

## Persistent Homology

What’s that? Petri et al. (2014):

> Persistent homology is a recent technique in computational topology developed for shape recognition and the analysis of high dimensional datasets. … The central idea is the construction of a sequence of successive approximations of the original dataset seen as a topological space X. This sequence of topological spaces \(X_0, X_1, \dots, X_N = X\) is such that \(X_i \subseteq X_j\) whenever \(i < j\) and is called the filtration.
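To make the filtration idea concrete, here is a minimal sketch (not Petri et al.'s actual pipeline) of the 1-skeleton of a Vietoris–Rips filtration: we connect points of a toy point cloud at an increasing sequence of distance thresholds, and check that the resulting complexes are nested, \(X_i \subseteq X_j\) for \(i < j\). The function name and thresholds are illustrative choices, and real work would use a library such as GUDHI or Ripser.

```python
from itertools import combinations

def rips_filtration(points, thresholds):
    """Sketch: the edge sets of a Vietoris-Rips filtration on a point cloud.

    Each complex contains every point, plus every edge whose endpoints lie
    within the given distance threshold. Because the thresholds increase,
    each edge set contains the previous one -- the nesting X_i <= X_j that
    defines a filtration. (Illustrative only; a real implementation would
    also add higher-dimensional simplices.)
    """
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

    filtration = []
    for eps in thresholds:
        edges = {frozenset((i, j))
                 for (i, p), (j, q) in combinations(enumerate(points), 2)
                 if dist(p, q) <= eps}
        filtration.append(edges)
    return filtration

# Toy point cloud: three collinear points at pairwise distances 1, 2, 3.
points = [(0.0, 0.0), (1.0, 0.0), (3.0, 0.0)]
complexes = rips_filtration(points, thresholds=[0.5, 1.5, 3.5])
# The edge sets grow monotonically as the threshold increases.
assert all(a <= b for a, b in zip(complexes, complexes[1:]))
```

Persistent homology then tracks which topological features (connected components, loops, voids) appear and disappear as the threshold sweeps through the filtration.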

## References

Bar-Sinai, Yohai, Stephan Hoyer, Jason Hickey, and Michael P. Brenner. 2019. “Learning Data-Driven Discretizations for Partial Differential Equations.” *Proceedings of the National Academy of Sciences* 116 (31): 15344–49.

Bar-Yam, Yaneer. 2003. *Dynamics of Complex Systems*. Westview Press.

Castiglione, Patrizia, and Massimo Falcioni. 2008. *Chaos and Coarse Graining in Statistical Mechanics*. Cambridge, UK; New York: Cambridge University Press.

Kelly, David, and Ian Melbourne. 2014. “Deterministic Homogenization for Fast-Slow Systems with Chaotic Noise,” September.

Petri, G., P. Expert, F. Turkheimer, R. Carhart-Harris, D. Nutt, P. J. Hellyer, and F. Vaccarino. 2014. “Homological Scaffolds of Brain Functional Networks.” *Journal of the Royal Society Interface* 11 (101): 20140873.

Plis, Sergey, David Danks, and Jianyu Yang. 2015. “Mesochronal Structure Learning.” *Uncertainty in Artificial Intelligence: Proceedings of the Conference on Uncertainty in Artificial Intelligence* 31 (July).