Learning on manifolds

Finding the lowest bit of a krazy straw, from the inside

October 21, 2011 — January 26, 2022

A placeholder for learning on curved spaces. Not discussed: learning OF the curvature of spaces.

AFAICT this usually boils down to defining an appropriate stochastic process on a manifold.
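A toy illustration of what "a stochastic process on a manifold" can look like (my own sketch, not from any of the cited references): crude Brownian motion on the unit sphere, taking Gaussian steps in the tangent plane and retracting back by normalisation. This is a first-order projection scheme, not an exact simulation of spherical Brownian motion.

```python
import numpy as np

def brownian_on_sphere(x0, n_steps=1000, dt=1e-3, seed=0):
    """Approximate Brownian motion on the unit sphere S^2.

    Each step: draw Gaussian noise, project it onto the tangent
    plane at the current point, then retract to the sphere by
    normalisation (a first-order scheme, not exact simulation).
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    x = x / np.linalg.norm(x)
    path = [x]
    for _ in range(n_steps):
        noise = rng.normal(scale=np.sqrt(dt), size=3)
        tangent = noise - (noise @ x) * x  # remove the normal component
        x = x + tangent
        x = x / np.linalg.norm(x)          # retract back onto the sphere
        path.append(x)
    return np.array(path)

path = brownian_on_sphere([1.0, 0.0, 0.0])  # every point stays on S^2
```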

1 Learning on a given manifold

Learning where there is an a priori given manifold also seems to fall under this heading. For example, the Stiefel and Grassmann manifolds are treated in depth in Chikuse (2003).
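To give a flavour: the Stiefel manifold of orthonormal k-frames in Rⁿ carries a uniform (Haar) distribution, which is easy to sample by a standard construction — take a Gaussian matrix and map it to its orthogonal polar factor. A minimal numpy sketch of that construction (my own, not code from the book):

```python
import numpy as np

def sample_stiefel(n, k, rng):
    """Draw a uniform (Haar) sample from the Stiefel manifold V_k(R^n),
    i.e. an n x k matrix with orthonormal columns: take a Gaussian
    matrix Z and return its orthogonal polar factor Z (Z^T Z)^{-1/2}."""
    Z = rng.normal(size=(n, k))
    w, V = np.linalg.eigh(Z.T @ Z)
    inv_sqrt = (V / np.sqrt(w)) @ V.T  # (Z^T Z)^{-1/2} via eigendecomposition
    return Z @ inv_sqrt

rng = np.random.default_rng(0)
X = sample_stiefel(5, 2, rng)  # columns of X are orthonormal
```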

Manifold optimisation implementations include Manopt (Boumal, Mishra, Absil, et al. 2014), Pymanopt (Townsend, Koep, and Weichwald 2016), ROPTLIB (Huang, Absil, Gallivan, et al. 2018), ManifoldOptim (Martin, Raim, Huang, et al. 2016), and Geomstats (Miolane, Mathe, Donnat, et al. 2018).

There are at least two textbooks online: Absil, Mahony, and Sepulchre (2008) and Boumal (2020).
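The basic move these optimisers automate can be sketched by hand (my own minimal example, not code from any of the packages or books cited here): Riemannian gradient ascent of the Rayleigh quotient on the unit sphere, which converges to a dominant eigenvector. The Euclidean gradient is projected onto the tangent space at the current iterate, and each update is retracted back onto the sphere by normalisation.

```python
import numpy as np

def dominant_eigvec(A, n_iters=500, step=0.1, seed=0):
    """Maximise x^T A x on the unit sphere by Riemannian gradient ascent."""
    rng = np.random.default_rng(seed)
    x = rng.normal(size=A.shape[0])
    x /= np.linalg.norm(x)
    for _ in range(n_iters):
        g = 2 * A @ x           # Euclidean gradient of x^T A x
        rg = g - (g @ x) * x    # project onto the tangent space at x
        x = x + step * rg       # ascent step along the Riemannian gradient
        x /= np.linalg.norm(x)  # retraction: back onto the sphere
    return x

A = np.array([[3.0, 1.0, 0.0],
              [1.0, 2.0, 0.5],
              [0.0, 0.5, 1.0]])
v = dominant_eigvec(A)  # aligns with the top eigenvector of A (up to sign)
```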

2 Information Geometry

The unholy offspring of Fisher information and differential geometry, about which I know little except that it sounds like it should be intuitive. It is probably synonymous with some of the other items on this page if I could sort out all this terminology. See information geometry.

3 Hamiltonian Monte Carlo

You can also discuss Hamiltonian Monte Carlo in this setting. I will not.

4 Langevin Monte Carlo

Girolami and Calderhead (2011) discuss Langevin Monte Carlo in this context.
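As a point of reference for what their Riemannian version generalises, here is a minimal sketch of plain Euclidean MALA, i.e. the flat-metric special case with an identity preconditioner; the target and tuning below are my own toy choices, not from the paper.

```python
import numpy as np

def mala(grad_log_p, log_p, x0, n_samples=50_000, eps=0.5, seed=0):
    """Metropolis-adjusted Langevin algorithm (Euclidean, flat metric).

    The Riemannian variant of Girolami & Calderhead replaces the
    identity preconditioner with a position-dependent metric tensor.
    """
    rng = np.random.default_rng(seed)
    x = float(x0)
    samples = np.empty(n_samples)
    for i in range(n_samples):
        mean_fwd = x + 0.5 * eps**2 * grad_log_p(x)
        prop = mean_fwd + eps * rng.normal()
        mean_bwd = prop + 0.5 * eps**2 * grad_log_p(prop)
        # Log proposal densities, up to shared normalising constants.
        log_q_fwd = -((prop - mean_fwd) ** 2) / (2 * eps**2)
        log_q_bwd = -((x - mean_bwd) ** 2) / (2 * eps**2)
        log_alpha = log_p(prop) - log_p(x) + log_q_bwd - log_q_fwd
        if np.log(rng.uniform()) < log_alpha:
            x = prop
        samples[i] = x
    return samples

# Target: standard normal, log p(x) = -x^2/2 up to a constant.
samples = mala(grad_log_p=lambda x: -x, log_p=lambda x: -0.5 * x**2, x0=0.0)
```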

5 Natural gradient

See natural gradients.
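A one-family illustration of the idea (my own toy sketch, not from the cited Amari papers): for a univariate Gaussian N(μ, σ²), the Fisher information in (μ, σ) is diag(1/σ², 2/σ²), so preconditioning the ordinary gradient of the average log-likelihood by the inverse Fisher matrix makes fitting the MLE fast and nearly scale-free.

```python
import numpy as np

def fit_gaussian_natgrad(x, n_iters=100, lr=0.5):
    """Fit N(mu, sigma^2) by natural-gradient ascent on the average
    log-likelihood. Fisher information in (mu, sigma) is
    diag(1/sigma^2, 2/sigma^2); natural gradient = F^{-1} @ gradient."""
    mu, sigma = 0.0, 1.0
    for _ in range(n_iters):
        g_mu = np.mean(x - mu) / sigma**2                       # d/d mu
        g_sigma = np.mean((x - mu) ** 2) / sigma**3 - 1.0 / sigma  # d/d sigma
        # Precondition by the inverse Fisher information.
        mu += lr * sigma**2 * g_mu
        sigma += lr * (sigma**2 / 2) * g_sigma
    return mu, sigma

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=3.0, size=10_000)
mu, sigma = fit_gaussian_natgrad(x)  # converges to the sample MLE
```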

6 Homogeneous probability

Albert Tarantola’s framing, from his manuscript. How does it relate to information geometry? I don’t know yet; I haven’t had time to read it. It is also not a common phrasing, which is a danger sign.

7 Incoming

8 References

Absil, Mahony, and Sepulchre. 2008. Optimization Algorithms on Matrix Manifolds.
Amari, Shun-ichi. 1987. “Differential Geometrical Theory of Statistics.” In Differential Geometry in Statistical Inference.
Amari, Shun-ichi. 1998. “Natural Gradient Works Efficiently in Learning.” Neural Computation.
Amari, Shun-ichi. 2001. “Information Geometry on Hierarchy of Probability Distributions.” IEEE Transactions on Information Theory.
Aswani, Bickel, and Tomlin. 2011. “Regression on Manifolds: Estimation of the Exterior Derivative.” The Annals of Statistics.
Azangulov, Smolensky, Terenin, et al. 2022. “Stationary Kernels and Gaussian Processes on Lie Groups and Their Homogeneous Spaces I: The Compact Case.”
Barndorff-Nielsen. 1987. “Differential and Integral Geometry in Statistical Inference.” In Differential Geometry in Statistical Inference.
Betancourt, Byrne, Livingstone, et al. 2017. “The Geometric Foundations of Hamiltonian Monte Carlo.” Bernoulli.
Borovitskiy, Terenin, Mostowsky, et al. 2020. “Matérn Gaussian Processes on Riemannian Manifolds.” arXiv:2006.10160 [Cs, Stat].
Boumal. 2013. “On Intrinsic Cramér-Rao Bounds for Riemannian Submanifolds and Quotient Manifolds.” IEEE Transactions on Signal Processing.
———. 2020. An Introduction to Optimization on Smooth Manifolds.
Boumal, Mishra, Absil, et al. 2014. “Manopt, a Matlab Toolbox for Optimization on Manifolds.” Journal of Machine Learning Research.
Boumal, Singer, Absil, et al. 2014. “Cramér-Rao Bounds for Synchronization of Rotations.” Information and Inference.
Carlsson, Ishkhanov, Silva, et al. 2008. “On the Local Behavior of Spaces of Natural Images.” International Journal of Computer Vision.
Chen, Silva, Paisley, et al. 2010. “Compressive Sensing on Manifolds Using a Nonparametric Mixture of Factor Analyzers: Algorithm and Performance Bounds.” IEEE Transactions on Signal Processing.
Chikuse. 2003. Statistics on Special Manifolds.
Fernández-Martínez, Fernández-Muñiz, Pallero, et al. 2013. “From Bayes to Tarantola: New Insights to Understand Uncertainty in Inverse Problems.” Journal of Applied Geophysics.
França, Barp, Girolami, et al. 2021. “Optimization on Manifolds: A Symplectic Approach.”
Ge, and Ma. 2017. “On the Optimization Landscape of Tensor Decompositions.” In Advances in Neural Information Processing Systems.
Girolami, and Calderhead. 2011. “Riemann Manifold Langevin and Hamiltonian Monte Carlo Methods.” Journal of the Royal Statistical Society: Series B (Statistical Methodology).
Głuch, and Urbanke. 2021. “Noether: The More Things Change, the More Stay the Same.” arXiv:2104.05508 [Cs, Stat].
Hosseini, and Sra. 2015. “Manifold Optimization for Gaussian Mixture Models.” arXiv Preprint arXiv:1506.07677.
Huang, Absil, Gallivan, et al. 2018. “ROPTLIB: An Object-Oriented C++ Library for Optimization on Riemannian Manifolds.” ACM Transactions on Mathematical Software.
Lauritzen. 1987. “Statistical Manifolds.” In Differential Geometry in Statistical Inference.
Ley, Babić, and Craens. 2021. “Flexible Models for Complex Data with Applications.” Annual Review of Statistics and Its Application.
Lezcano Casado. 2019. “Trivializations for Gradient-Based Optimization on Manifolds.” In Advances in Neural Information Processing Systems.
Manton. 2013. “A Primer on Stochastic Differential Geometry for Signal Processing.” IEEE Journal of Selected Topics in Signal Processing.
Mardia, and Jupp. 2009. Directional Statistics.
Martin, Raim, Huang, et al. 2016. “ManifoldOptim: An R Interface to the ROPTLIB Library for Riemannian Manifold Optimization.”
Miolane, Mathe, Donnat, et al. 2018. “Geomstats: A Python Package for Riemannian Geometry in Machine Learning.” arXiv:1805.08308 [Cs, Stat].
Mosegaard, and Tarantola. 1995. “Monte Carlo Sampling of Solutions to Inverse Problems.” Journal of Geophysical Research: Solid Earth.
Mukherjee, Wu, and Zhou. 2010. “Learning Gradients on Manifolds.” Bernoulli.
Peters. 2010. “Policy Gradient Methods.” Scholarpedia.
Popov. 2022. “Combining Data-Driven and Theory-Guided Models in Ensemble Data Assimilation.” ETD.
Rao, Lin, and Dunson. n.d. “Bayesian Inference on the Stiefel Manifold.”
Saul. 2023. “A Geometrical Connection Between Sparse and Low-Rank Matrices and Its Application to Manifold Learning.” Transactions on Machine Learning Research.
Seshadhri, Sharma, Stolman, et al. 2020. “The Impossibility of Low-Rank Representations for Triangle-Rich Complex Networks.” Proceedings of the National Academy of Sciences.
Stam. 1982. “Limit Theorems for Uniform Distributions on Spheres in High-Dimensional Euclidean Spaces.” Journal of Applied Probability.
Steinke, and Hein. 2009. “Non-Parametric Regression Between Manifolds.” In Advances in Neural Information Processing Systems 21.
Townsend, Koep, and Weichwald. 2016. “Pymanopt: A Python Toolbox for Optimization on Manifolds Using Automatic Differentiation.” Journal of Machine Learning Research.
Transtrum, Machta, and Sethna. 2011. “The Geometry of Nonlinear Least Squares with Applications to Sloppy Models and Optimization.” Physical Review E.
Wang, and Zhuang. 2016. “Tight Framelets and Fast Framelet Transforms on Manifolds.” arXiv:1608.04026 [Math].
Xifara, Sherlock, Livingstone, et al. 2014. “Langevin Diffusions and the Metropolis-Adjusted Langevin Algorithm.” Statistics & Probability Letters.