Learning with conservation laws, invariances and symmetries

April 11, 2020 — February 25, 2022

Tags: algebra, how do science, information, machine learning, networks, physics, probability, sciml, statistics, statmech
Figure 1: Failure of conservation of mass at system boundaries is a common problem in models with nonparametric likelihood

Learning in complicated systems where we know that there is a conservation law in effect. Or, more ambitiously, learning a conservation law that we did not know was in effect. As seen especially in ML for physics. Possibly this is the same idea as learning on manifolds, but the literatures do not seem to be closely connected. This is not, AFAIK, a particular challenge in traditional parametric statistics, where we can usually impose conservation laws on a problem through the likelihood; but in nonparametric models, or in overparameterised models such as neural nets, it can get fiddly. Where does conservation of mass, momentum, energy etc. reside in a convnet? And what if we do not need to conserve a quantity exactly but merely wish to regularise the system towards a conservation law?
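A minimal sketch of that last, soft option: penalise violations of mass conservation in the loss rather than enforcing them exactly. The network, the sizes, and the penalty weight below are all illustrative assumptions, not a recipe from any of the cited papers.

```python
import torch

# Hypothetical model: map a discretised density field x to a
# predicted next state y_hat. Physics says total mass sum(x) should
# be preserved, so we add a soft penalty on the violation.
net = torch.nn.Sequential(
    torch.nn.Linear(64, 128), torch.nn.Tanh(), torch.nn.Linear(128, 64)
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
lam = 10.0  # conservation penalty weight; a tunable regularisation knob

def loss_fn(x, y):
    y_hat = net(x)
    data_loss = ((y_hat - y) ** 2).mean()
    # total "mass" of the prediction should equal that of the input
    mass_violation = ((y_hat.sum(dim=-1) - x.sum(dim=-1)) ** 2).mean()
    return data_loss + lam * mass_violation

x, y = torch.randn(32, 64), torch.randn(32, 64)  # stand-in data
opt.zero_grad()
loss_fn(x, y).backward()
opt.step()
```

Sending the penalty weight to infinity recovers a hard constraint in the limit, but exact constraints are usually better handled architecturally (cf. Mohan et al. 2020).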

Possibly also in this category: there is a particular type of statistical-mechanical law on the learning process itself, specifically energy conservation in neural net signal propagation, which is not a conservation law in the regression model per se, but a stability guarantee on the model and its training process. This is the deep-learning-as-dynamical-system trick.

Even without trying to impose constraints on the model, a whole bunch of conservation laws and symmetries are exploited in training procedures: for example, in potential theory, in the statistical mechanics of learning, and in the conservation of the Hamiltonian that underpins Hamiltonian Monte Carlo.
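To make the last example concrete, here is a toy HMC step for a standard Gaussian target. This is a generic textbook sketch, not tied to any reference above: the leapfrog integrator approximately conserves the Hamiltonian, and the Metropolis test corrects the residual discretisation error exactly.

```python
import numpy as np

def grad_U(q):
    return q  # gradient of the potential U(q) = q^2 / 2

def hmc_step(q, step=0.1, n_leapfrog=20, rng=None):
    rng = rng or np.random.default_rng()
    p = rng.standard_normal()                  # resample momentum
    q_new, p_new = q, p
    p_new -= 0.5 * step * grad_U(q_new)        # initial half kick
    for _ in range(n_leapfrog - 1):
        q_new += step * p_new                  # drift
        p_new -= step * grad_U(q_new)          # full kick
    q_new += step * p_new                      # final drift
    p_new -= 0.5 * step * grad_U(q_new)        # final half kick
    H_old = 0.5 * q ** 2 + 0.5 * p ** 2       # H(q, p) = U(q) + p^2 / 2
    H_new = 0.5 * q_new ** 2 + 0.5 * p_new ** 2
    # accept with probability min(1, exp(H_old - H_new))
    return q_new if np.log(rng.random()) < H_old - H_new else q
```

The better the integrator conserves H, the closer the acceptance rate sits to 1, which is the sense in which the conservation law is doing the work here.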

AFAICT the state of the art is reviewed well in Bloem-Reddy and Teh (2020).

1 Incoming

What is the Learning invariant representations trick?

Recent entrants in this area:

we demonstrate principled advantages of enforcing conservation laws of the form $g_\phi(f_\theta(x)) = g_\phi(x)$ by considering a special case where preimages under $g_\phi$ form affine subspaces.
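A toy illustration of what such a hard constraint can look like when the conserved quantity is linear, so that its preimages are affine subspaces. Everything here, including the choice g(y) = sum(y), is my own illustrative assumption, not the construction from the quoted paper.

```python
import numpy as np

# Hard-constraint layer for a linear conserved quantity g(y) = w @ y,
# whose preimages {y : w @ y = c} are affine subspaces. Projecting the
# raw network output onto the preimage of g(x) makes g(f(x)) = g(x)
# hold exactly by construction.
def project_to_conservation(y_raw, x, w):
    violation = w @ x - w @ y_raw            # constraint violation
    return y_raw + w * violation / (w @ w)   # orthogonal projection step

w = np.ones(4)                               # g(y) = sum(y): total "mass"
x = np.array([1.0, 2.0, 3.0, 4.0])           # input state, mass 10
y_raw = np.array([0.5, 0.5, 0.5, 0.5])       # stand-in for raw net output
y = project_to_conservation(y_raw, x, w)
assert np.isclose(w @ y, w @ x)              # conserved quantity matches
```

Nonlinear conserved quantities do not generally admit such a closed-form projection, which is presumably why the affine special case is singled out.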

Noether networks look interesting.

Also related: learning with a PDE constraint (a soft version is sketched below).
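For concreteness, a minimal sketch of the soft PDE-constraint idea in the spirit of physics-informed neural networks (Raissi et al. 2019): penalise the residual of, say, the heat equation u_t = u_xx at sampled collocation points. The tiny network and all sizes are illustrative.

```python
import torch

net = torch.nn.Sequential(torch.nn.Linear(2, 32), torch.nn.Tanh(),
                          torch.nn.Linear(32, 1))

# random collocation points in (t, x) space
t = torch.rand(64, 1, requires_grad=True)
x = torch.rand(64, 1, requires_grad=True)
u = net(torch.cat([t, x], dim=1))

# derivatives of u w.r.t. its inputs via autograd
u_t = torch.autograd.grad(u.sum(), t, create_graph=True)[0]
u_x = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
u_xx = torch.autograd.grad(u_x.sum(), x, create_graph=True)[0]

pde_loss = ((u_t - u_xx) ** 2).mean()  # add this to the data-fitting loss
```

This regularises towards the PDE rather than enforcing it, the same soft/hard trade-off as with the conservation penalty above.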

Not sure where Dax et al. (2021) fits in.

Erik Bekkers’ seminar on Group Equivariant Deep Learning looks interesting.

2 Permutation invariance and equivariance

See deep sets.
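The canonical construction there represents a permutation-invariant function of a set as rho(sum_i phi(x_i)): sum-pooling over the set axis makes the output indifferent to reordering by construction. A minimal sketch, with illustrative sizes:

```python
import torch

# Deep-Sets-style permutation-invariant model: elementwise encoder phi,
# permutation-invariant sum pooling, then decoder rho.
phi = torch.nn.Sequential(torch.nn.Linear(3, 32), torch.nn.ReLU(),
                          torch.nn.Linear(32, 32))
rho = torch.nn.Sequential(torch.nn.Linear(32, 32), torch.nn.ReLU(),
                          torch.nn.Linear(32, 1))

def deep_set(x):                       # x: (batch, set_size, features)
    return rho(phi(x).sum(dim=1))      # sum over the set axis => invariance

x = torch.randn(8, 10, 3)
perm = torch.randperm(10)
assert torch.allclose(deep_set(x), deep_set(x[:, perm]), atol=1e-5)
```

Equivariant (rather than invariant) layers are obtained by dropping the final pooling and letting each element interact with the pooled summary.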

3 References

Azangulov, Smolensky, Terenin, et al. 2022. “Stationary Kernels and Gaussian Processes on Lie Groups and Their Homogeneous Spaces I: The Compact Case.”
Bloem-Reddy, and Teh. 2020. “Probabilistic Symmetries and Invariant Neural Networks.”
Chen, Huang, Raghupathi, et al. 2022. “Automated Discovery of Fundamental Variables Hidden in Experimental Data.” Nature Computational Science.
Cranmer, Miles, Greydanus, Hoyer, et al. 2020. “Lagrangian Neural Networks.” arXiv:2003.04630 [Physics, Stat].
Cranmer, Miles D., Xu, Battaglia, et al. 2019. “Learning Symbolic Physics with Graph Networks.” In Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS).
Dax, Green, Gair, et al. 2021. “Group Equivariant Neural Posterior Estimation.” arXiv:2111.13139 [Astro-Ph, Physics:gr-Qc, Stat].
Erichson, Muehlebach, and Mahoney. 2019. “Physics-Informed Autoencoders for Lyapunov-Stable Fluid Flow Prediction.” In Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS).
Głuch, and Urbanke. 2021. “Noether: The More Things Change, the More Stay the Same.” arXiv:2104.05508 [Cs, Stat].
Greydanus, Dzamba, and Yosinski. 2019. “Hamiltonian Neural Networks.” In Advances in Neural Information Processing Systems 32.
Kofinas, Bekkers, Nagaraja, et al. 2023. “Latent Field Discovery in Interacting Dynamical Systems with Neural Fields.” Advances in Neural Information Processing Systems.
Krämer, Köhler, and Noé. 2020. “Training Invertible Linear Layers Through Rank-One Perturbations.” arXiv:2010.07033 [Physics, Stat].
Lattner, Dorfler, and Arzt. 2019. “Learning Complex Basis Functions for Invariant Representations of Audio.” In Proceedings of the 20th Conference of the International Society for Music Information Retrieval.
Lutter, Ritter, and Peters. 2019. “Deep Lagrangian Networks: Using Physics as Model Prior for Deep Learning.” arXiv:1907.04490 [Cs, Eess, Stat].
Mohan, Lubbers, Livescu, et al. 2020. “Embedding Hard Physical Constraints in Neural Network Coarse-Graining of 3D Turbulence.” arXiv:2002.00021 [Physics].
Mototake. 2019. “Conservation Law Estimation by Extracting the Symmetry of a Dynamical System Using a Deep Neural Network.” In Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS).
Popov. 2022. “Combining Data-Driven and Theory-Guided Models in Ensemble Data Assimilation.” ETD.
Raissi, Perdikaris, and Karniadakis. 2017. “Machine Learning of Linear Differential Equations Using Gaussian Processes.” Journal of Computational Physics.
Raissi, Perdikaris, and Karniadakis. 2019. “Physics-Informed Neural Networks: A Deep Learning Framework for Solving Forward and Inverse Problems Involving Nonlinear Partial Differential Equations.” Journal of Computational Physics.
Rezende, Racanière, Higgins, et al. 2019. “Equivariant Hamiltonian Flows.” In Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS).
Sanchez-Gonzalez, Bapst, Battaglia, et al. 2019. “Hamiltonian Graph Networks with ODE Integrators.” In Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS).
Smets, Portegies, Bekkers, et al. 2023. “PDE-Based Group Equivariant Convolutional Neural Networks.” Journal of Mathematical Imaging and Vision.
Thickstun, Harchaoui, Foster, et al. 2018. “Invariances and Data Augmentation for Supervised Music Transcription.” In 2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).
Wang, Walters, and Yu. 2022. “Data Augmentation Vs. Equivariant Networks: A Theory of Generalization on Dynamics Forecasting.”
Zhang, Wang, Helwig, et al. 2023. “Artificial Intelligence for Science in Quantum, Atomistic, and Continuum Systems.”
Zhu, Zabaras, Koutsourelakis, et al. 2019. “Physics-Constrained Deep Learning for High-Dimensional Surrogate Modeling and Uncertainty Quantification Without Labeled Data.” Journal of Computational Physics.