Ergodicity and mixing

Things that probably happen eventually on average



πŸ—

Relevance to actual stochastic processes and dynamical systems, especially linear and non-linear system identification.

Keywords to look up:

  • probability-free ergodicity
  • Birkhoff ergodic theorem
  • Frobenius-Perron operator
  • Quasicompactness, correlation decay
  • C&C CLT for Markov chains (Nagaev)
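To make the Birkhoff ergodic theorem concrete: for an ergodic measure-preserving map, time averages of an observable along almost every orbit converge to the space average. A minimal sketch, using an irrational rotation of the circle, which is uniquely ergodic for Lebesgue measure (the function names here are my own, not from any particular library):

```python
import math

def birkhoff_average(f, T, x0, n):
    """Time average (1/n) * sum_{k=0}^{n-1} f(T^k x0) along the orbit of x0."""
    total, x = 0.0, x0
    for _ in range(n):
        total += f(x)
        x = T(x)
    return total / n

# Irrational rotation on the circle [0, 1): uniquely ergodic for Lebesgue measure.
alpha = (math.sqrt(5) - 1) / 2   # golden-ratio rotation number
T = lambda x: (x + alpha) % 1.0
f = lambda x: x * x              # space average is the integral of x^2 over [0,1], i.e. 1/3

avg = birkhoff_average(f, T, x0=0.1, n=100_000)
print(avg)  # ≈ 1/3, as Birkhoff's theorem predicts
```

A rational rotation number would instead give a periodic orbit whose time average depends on the starting point, which is exactly the failure of ergodicity.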

Not much material here yet, but see learning theory for dependent data for some interesting categorisations of mixing conditions, and for how the various mixing assumptions carry over into guarantees for statistical estimators.

My main interest is the following four-stages-of-grief kind of setup.

  1. Often I can prove that I can learn a thing from my data if it is stationary.
  2. But I rarely have stationarity, so showing that the estimator is at least ergodic might be more useful; that would follow from appropriate mixing conditions, which do not necessarily assume stationarity.
  3. Except that these theorems are often hard to prove, or their conditions hard to estimate, or they require knowing the very parameters I am trying to infer; so perhaps showing some kind of partial identifiability is closer to what I need.
  4. Furthermore, I usually would prefer a finite-sample result instead of some asymptotic guarantee. Sometimes I can get those from learning theory for dependent data.

That last one is TBC.

Coupling from the past

Dan Piponi explains coupling from the past via functional programming for Markov chains.
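As a hedged sketch of the idea (the toy chain and helper names are my own, not Piponi's code): for a monotone update function it suffices to couple the trajectories started from the minimal and maximal states with shared randomness; if they have coalesced by time 0 when run from time −T, then every start state has coalesced, and the common value is an exact draw from the stationary distribution. Here the chain is a reflecting random walk on {0,…,4} (doubly stochastic, so its stationary distribution is uniform):

```python
import random

def update(x, u, n=5):
    """Monotone random update for a reflecting walk on {0, ..., n-1}:
    step up on u < 0.5, down otherwise, holding at the boundaries."""
    if u < 0.5:
        return min(x + 1, n - 1)
    return max(x - 1, 0)

def cftp(n=5, seed=0):
    """Coupling from the past for the monotone walk: returns an exact sample
    from the stationary distribution (uniform on {0, ..., n-1})."""
    rng = random.Random(seed)
    us = []  # fixed randomness for times -1, -2, ...; reused as T grows
    T = 1
    while True:
        while len(us) < T:
            us.append(rng.random())
        lo, hi = 0, n - 1
        # run both extremal trajectories from time -T to 0 with shared randomness
        for t in range(T - 1, -1, -1):
            lo = update(lo, us[t], n)
            hi = update(hi, us[t], n)
        if lo == hi:   # sandwiched: every start state has coalesced
            return lo
        T *= 2         # go further into the past, reusing the old randomness

samples = [cftp(seed=s) for s in range(2000)]
```

The crucial (and easy to botch) detail is that the randomness for each past time step is fixed once and reused as T doubles; resampling it would bias the output.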

Mixing zoo

A recommended partial overview is Bradley (2005). 🏗
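The mixing coefficients in Bradley's zoo are defined via suprema over σ-algebras of past and future events, which is rarely computable directly. As a cheap numeric proxy, here is the geometric correlation decay of a stationary AR(1) process, the canonical example of a geometrically β-mixing chain (the code and names are my own illustration):

```python
import random

def ar1_series(phi=0.8, n=200_000, seed=1):
    """Simulate a stationary AR(1): X_t = phi * X_{t-1} + eps_t, eps_t ~ N(0,1)."""
    rng = random.Random(seed)
    x, xs = 0.0, []
    for _ in range(n):
        x = phi * x + rng.gauss(0.0, 1.0)
        xs.append(x)
    return xs

def autocorr(xs, lag):
    """Sample autocorrelation at the given lag."""
    n = len(xs)
    m = sum(xs) / n
    num = sum((xs[t] - m) * (xs[t + lag] - m) for t in range(n - lag))
    den = sum((v - m) ** 2 for v in xs)
    return num / den

xs = ar1_series()
for lag in (1, 5, 10):
    print(lag, autocorr(xs, lag))  # decays like 0.8 ** lag
```

Correlation decay is strictly weaker than mixing in general, but for Gaussian linear processes like this one the two go together.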

β-mixing

πŸ—

φ-mixing

πŸ—

Sequential Rademacher complexity

πŸ—

Lyapunov exponents

John D. Cook says:

Chaotic systems are unpredictable. Or rather chaotic systems are not deterministically predictable in the long run. You can make predictions if you weaken one of these requirements. You can make deterministic predictions in the short run, or statistical predictions in the long run. Lyapunov exponents are a way to measure how quickly the short run turns into the long run.
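A minimal numeric illustration of Cook's point (my own sketch, not his code): for a one-dimensional map, the largest Lyapunov exponent is the orbit average of ln|f′(x)|, and for the logistic map at r = 4 it comes out to ln 2 ≈ 0.693, i.e. nearby orbits separate by roughly a factor of 2 per step.

```python
import math

def lyapunov_logistic(r=4.0, x0=0.123, n_transient=1000, n=100_000):
    """Estimate the largest Lyapunov exponent of x -> r x (1 - x)
    as the time average of ln|f'(x)| = ln|r - 2 r x| along an orbit."""
    x = x0
    for _ in range(n_transient):          # discard the transient
        x = r * x * (1.0 - x)
    acc = 0.0
    for _ in range(n):
        acc += math.log(abs(r - 2.0 * r * x))
        x = r * x * (1.0 - x)
    return acc / n

lam = lyapunov_logistic()
print(lam)  # ≈ ln 2 ≈ 0.693 for r = 4
```

The reciprocal 1/λ sets the timescale on which deterministic short-run prediction gives way to long-run statistics.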

References

Berry, Tyrus, Dimitrios Giannakis, and John Harlim. 2020. “Bridging Data Science and Dynamical Systems Theory.” arXiv:2002.07928 [Physics, Stat], June.
Bradley, Richard C. 2005. “Basic Properties of Strong Mixing Conditions. A Survey and Some Open Questions.” Probability Surveys 2: 107–44.
Brémaud, Pierre, and Laurent Massoulié. 2001. “Hawkes Branching Point Processes Without Ancestors.” Journal of Applied Probability 38 (1): 122–35.
Delft, Anne van, and Michael Eichler. 2016. “Locally Stationary Functional Time Series.” arXiv:1602.05125 [Math, Stat], February.
Diaconis, Persi, and David Freedman. 1999. “Iterated Random Functions.” SIAM Review 41 (1): 45–76.
Gray, Robert M. 1987. Probability, Random Processes, and Ergodic Properties. Springer.
Keane, Michael, and Karl Petersen. 2006. “Easy and Nearly Simultaneous Proofs of the Ergodic Theorem and Maximal Ergodic Theorem.” IMS Lecture Notes–Monograph Series, Dynamics & Stochastics 48.
Kuznetsov, Vitaly, and Mehryar Mohri. 2014. “Generalization Bounds for Time Series Prediction with Non-Stationary Processes.” In Algorithmic Learning Theory, edited by Peter Auer, Alexander Clark, Thomas Zeugmann, and Sandra Zilles, 260–74. Lecture Notes in Computer Science. Bled, Slovenia: Springer International Publishing.
———. 2016. “Generalization Bounds for Non-Stationary Mixing Processes.” Machine Learning Journal.
Livan, Giacomo, Jun-ichi Inoue, and Enrico Scalas. 2012. “On the Non-Stationarity of Financial Time Series: Impact on Optimal Portfolio Selection.” Journal of Statistical Mechanics: Theory and Experiment 2012 (07): P07025.
McDonald, Daniel J., Cosma Rohilla Shalizi, and Mark Schervish. 2011. “Risk Bounds for Time Series Without Strong Mixing.” arXiv:1106.0730 [Cs, Stat], June.
Mohri, Mehryar, and Afshin Rostamizadeh. 2009. “Stability Bounds for Stationary φ-Mixing and β-Mixing Processes.” Journal of Machine Learning Research 4: 1–26.
Morvai, Gusztáv, Sidney Yakowitz, and László Györfi. 1996. “Nonparametric Inference for Ergodic, Stationary Time Series.” The Annals of Statistics 24 (1): 370–79.
Palmer, Richard G. 1982. “Broken Ergodicity.” Advances in Physics 31 (6): 669–735.
Propp, James Gary, and David Bruce Wilson. 1996. “Exact Sampling with Coupled Markov Chains and Applications to Statistical Mechanics.” Random Structures & Algorithms 9: 223–52. New York: John Wiley & Sons.
———. 1998. “Coupling from the Past: A User’s Guide.” In Microsurveys in Discrete Probability, edited by David Aldous and James Gary Propp, 41:181–92. DIMACS Series in Discrete Mathematics and Theoretical Computer Science. Providence, Rhode Island: American Mathematical Society.
Rosenblatt, M. 1984. “Asymptotic Normality, Strong Mixing and Spectral Density Estimates.” The Annals of Probability 12 (4): 1167–80.
Ryabko, Daniil, and Boris Ryabko. 2010. “Nonparametric Statistical Inference for Ergodic Processes.” IEEE Transactions on Information Theory 56 (3): 1430–35.
Shao, Xiaofeng, and Wei Biao Wu. 2007. “Asymptotic Spectral Theory for Nonlinear Time Series.” The Annals of Statistics 35 (4): 1773–1801.
Shields, P. C. 1998. “The Interactions Between Ergodic Theory and Information Theory.” IEEE Transactions on Information Theory 44 (6): 2079–93.
Steif, Jeffrey E. 1997. “Consistent Estimation of Joint Distributions for Sufficiently Mixing Random Fields.” The Annals of Statistics 25 (1): 293–304.
Stein, D. L., and C. M. Newman. 1995. “Broken Ergodicity and the Geometry of Rugged Landscapes.” Physical Review E 51 (6): 5228–38.
Thouvenot, Jean-Paul, and Benjamin Weiss. 2012. “Limit Laws for Ergodic Processes.” Stochastics and Dynamics 12 (01): 1150012.
