Learning in complicated systems where we know a conservation law is in effect; or, more ambitiously, learning a conservation law that we did not know was in effect. This comes up especially in ML for physics. It is not, as far as I know, a particular challenge in traditional parametric statistics, where we can impose conservation laws on a problem through the likelihood; but in nonparametric models, or overparameterised models such as neural nets, this can get fiddly. Where does conservation of mass, momentum, energy etc. reside in a convnet?
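One cheap trick, when the conserved quantity lives in the output space, is to bake the constraint into the architecture rather than the likelihood. A toy sketch (everything here — the species count, the softmax head, the weights — is illustrative, not any particular paper's method): if a network predicts mass fractions of some species, a softmax output guarantees the fractions are nonnegative and sum to 1, so total mass is conserved by construction, whatever the weights happen to be.

```python
import numpy as np

rng = np.random.default_rng(0)

def predict_fractions(x, W, b):
    """Toy regression head predicting mass fractions of 3 species.

    The softmax guarantees outputs are nonnegative and sum to 1,
    so total mass is conserved by construction, for any weights.
    """
    logits = x @ W + b
    z = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return z / z.sum(axis=-1, keepdims=True)

x = rng.normal(size=(5, 4))   # 5 inputs, 4 features
W = rng.normal(size=(4, 3))   # untrained, arbitrary weights
b = rng.normal(size=3)
fractions = predict_fractions(x, W, b)
print(fractions.sum(axis=-1))  # each row sums to 1
```

Hard constraints of this flavour (projection layers, divergence-free parameterisations) are what e.g. the "embedding hard physical constraints" literature below generalises.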

There is a particular type of conservation law that we frequently impose upon deep learning: energy conservation in neural net signal propagation. This is not a conservation law in the regression model *per se*, but a conservation law that ensures the model itself is trainable.
This is the deep learning as dynamical system trick.
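A minimal numpy sketch of that signal-propagation idea (an illustration, not any specific method from the references): an orthogonal weight matrix preserves the Euclidean norm, the "energy", of the signal exactly, so repeated linear propagation through such layers neither explodes nor vanishes.

```python
import numpy as np

rng = np.random.default_rng(1)

# Draw a random orthogonal matrix via QR decomposition.
A = rng.normal(size=(64, 64))
Q, _ = np.linalg.qr(A)

x = rng.normal(size=64)
n0 = np.linalg.norm(x)         # "energy" of the input signal

for _ in range(50):            # 50 "layers" of linear propagation
    x = Q @ x

n1 = np.linalg.norm(x)
print(n0, n1)                  # equal up to floating-point error
```

This is the intuition behind orthogonal initialisation and norm-preserving architectures; with nonlinearities in between, the conservation only holds approximately and takes more care.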
In fact, there are a whole bunch of conservation laws and symmetries implicit in what we do: in potential theory, in the statistical mechanics of learning, and in the use of conservation laws in Hamiltonian Monte Carlo. In deep learning, however, these do not necessarily align with the symmetries and conservation laws of the subject matter.
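The Hamiltonian Monte Carlo case is worth spelling out, since it is the same machinery that the Hamiltonian-network papers below exploit: a symplectic integrator such as leapfrog nearly conserves the Hamiltonian \(H(q,p) = U(q) + p^2/2\) over long trajectories. A self-contained sketch for a quadratic potential (the standard Gaussian target; the step size and step count are arbitrary choices for illustration):

```python
import numpy as np

def leapfrog(q, p, grad_U, eps, steps):
    """Leapfrog integrator: symplectic, so it nearly conserves H."""
    p = p - 0.5 * eps * grad_U(q)      # half step in momentum
    for _ in range(steps - 1):
        q = q + eps * p                # full step in position
        p = p - eps * grad_U(q)        # full step in momentum
    q = q + eps * p
    p = p - 0.5 * eps * grad_U(q)      # closing half step
    return q, p

U = lambda q: 0.5 * q**2               # quadratic potential energy
grad_U = lambda q: q
H = lambda q, p: U(q) + 0.5 * p**2     # total energy

q0, p0 = 1.0, 0.5
q1, p1 = leapfrog(q0, p0, grad_U, eps=0.01, steps=1000)
print(H(q0, p0), H(q1, p1))            # energies agree closely
```

The conserved quantity here belongs to the *sampler*, not to the physical system being modelled, which is exactly the misalignment flagged above; a Hamiltonian neural network instead learns \(H\) itself and gets (near-)conservation in the modelled dynamics for free.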

I wonder if the Learning invariant representations idea could help.

Cranmer, Miles D, Rui Xu, Peter Battaglia, and Shirley Ho. 2019. “Learning Symbolic Physics with Graph Networks.” In *Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS)*, 6.

Erichson, N Benjamin, Michael Muehlebach, and Michael W Mahoney. 2019. “Physics-Informed Autoencoders for Lyapunov-Stable Fluid Flow Prediction.” In *Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS)*, 6.

Greydanus, Samuel, Misko Dzamba, and Jason Yosinski. 2019. “Hamiltonian Neural Networks.” In *Advances in Neural Information Processing Systems 32*, edited by H. Wallach, H. Larochelle, A. Beygelzimer, F. d'Alché-Buc, E. Fox, and R. Garnett, 15379–89. Curran Associates, Inc. http://papers.nips.cc/paper/9672-hamiltonian-neural-networks.pdf.

Lattner, Stefan, Monika Dörfler, and Andreas Arzt. 2019. “Learning Complex Basis Functions for Invariant Representations of Audio.” In *Proceedings of the 20th Conference of the International Society for Music Information Retrieval*, 8. http://archives.ismir.net/ismir2019/paper/000085.pdf.

Mohan, Arvind T., Nicholas Lubbers, Daniel Livescu, and Michael Chertkov. 2020. “Embedding Hard Physical Constraints in Neural Network Coarse-Graining of 3D Turbulence,” February. http://arxiv.org/abs/2002.00021.

Mototake, Yoh-ichi. 2019. “Conservation Law Estimation by Extracting the Symmetry of a Dynamical System Using a Deep Neural Network.” In *Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS)*, 8.

Raissi, Maziar, and George Em Karniadakis. 2017. “Machine Learning of Linear Differential Equations Using Gaussian Processes,” January. http://arxiv.org/abs/1701.02440.

Raissi, M., P. Perdikaris, and G. E. Karniadakis. 2019. “Physics-Informed Neural Networks: A Deep Learning Framework for Solving Forward and Inverse Problems Involving Nonlinear Partial Differential Equations.” *Journal of Computational Physics* 378 (February): 686–707. https://doi.org/10.1016/j.jcp.2018.10.045.

Rezende, Danilo J, Sébastien Racanière, Irina Higgins, and Peter Toth. 2019. “Equivariant Hamiltonian Flows.”

Sanchez-Gonzalez, Alvaro, Victor Bapst, Peter Battaglia, and Kyle Cranmer. 2019. “Hamiltonian Graph Networks with ODE Integrators.”

Wang, Rui, Rose Yu, Karthik Kashinath, and Mustafa Mustafa. n.d. “Towards Physics-Informed Deep Learning for Turbulent Flow Prediction.”