Learning in complicated systems where we know a conservation law is in effect; or, more ambitiously, learning a conservation law that we did not know was in effect. This comes up especially in ML for physics. AFAIK this is not a particular challenge in traditional parametric statistics, where we can usually impose conservation laws on a problem through the likelihood; but in nonparametric models, or overparameterised models such as neural nets, it can get fiddly. Where does conservation of mass, momentum, energy etc. reside in a convnet? And what if we do not need to conserve a quantity exactly, but wish to regularise the system towards a conservation law?
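A minimal sketch of the soft-constraint version of that last idea (the function names and weighting are my own illustrative choices, not from any particular paper): penalise a model output for failing to conserve some summed quantity of the input, rather than enforcing conservation exactly.

```python
import numpy as np

def conservation_penalty(x, y, weight=1.0):
    """Soft penalty nudging model output y toward conserving the
    total 'mass' of input x; added to the usual training loss."""
    return weight * (y.sum() - x.sum()) ** 2

rng = np.random.default_rng(0)
x = rng.normal(size=8)
y = x + 0.1 * rng.normal(size=8)   # stand-in for a model output
penalty = conservation_penalty(x, y)
```

Cranking `weight` up recovers (in the limit) a hard constraint; keeping it finite trades conservation error against fit, which is often what we want with noisy data.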

Possibly also in this category: there is a particular type of statistical-mechanical law on the learning process itself, specifically energy conservation in neural-net signal propagation. This is not a conservation law in the regression model *per se*, but a stability guarantee on the model and its training process.
This is the deep learning as dynamical system trick.

Without trying to impose constraints on the model itself, a whole bunch of conservation laws and symmetries are exploited in *training procedures*: for example in potential theory, in the statistical mechanics of learning, and in the use of energy conservation in Hamiltonian Monte Carlo.
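The HMC case is the most concrete: the leapfrog integrator approximately conserves the Hamiltonian $H(q,p) = U(q) + p^2/2$, which is what keeps acceptance rates high. A toy sketch on a standard Gaussian target (all names here are my own, assuming the textbook leapfrog scheme):

```python
import numpy as np

def leapfrog(q, p, grad_U, step, n_steps):
    """Leapfrog integration of Hamiltonian dynamics.
    Symplectic, so it approximately conserves H(q, p) = U(q) + p^2/2."""
    p = p - 0.5 * step * grad_U(q)        # half-step in momentum
    for _ in range(n_steps - 1):
        q = q + step * p                  # full step in position
        p = p - step * grad_U(q)          # full step in momentum
    q = q + step * p
    p = p - 0.5 * step * grad_U(q)        # final half-step in momentum
    return q, p

# Standard Gaussian target: U(q) = q^2 / 2, so grad U(q) = q.
grad_U = lambda q: q
H = lambda q, p: 0.5 * q ** 2 + 0.5 * p ** 2

q0, p0 = 1.0, 0.5
q1, p1 = leapfrog(q0, p0, grad_U, step=0.1, n_steps=50)
energy_drift = abs(H(q1, p1) - H(q0, p0))  # stays O(step^2), not O(n_steps)
```

The point of the conservation law: the energy error stays bounded over long trajectories rather than accumulating, so the Metropolis correction rarely rejects.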

## Incoming

What is Learning invariant representations trick?

Recent entrants in this area:

- Lagrangian neural networks.
- C&C (M. Cranmer et al. 2020; Lutter, Ritter, and Peters 2019).
- permutation invariance

> we demonstrate principled advantages of enforcing conservation laws of the form $g_\phi(f_\theta(x)) = g_\phi(x)$ by considering a special case where preimages under $g_\phi$ form affine subspaces.
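A minimal numpy sketch of that affine special case (the function names are my own, and I take the simplest affine $g$, a coordinate sum): when $g$ is affine, the preimage $g^{-1}(g(x))$ is an affine subspace, so a hard constraint layer can project the unconstrained output onto it in closed form.

```python
import numpy as np

def project_to_conserve_sum(x, y):
    """Hard-constraint correction: shift y so that g(y) = g(x)
    for g(v) = v.sum(). For affine g the preimage is an affine
    subspace and the (orthogonal) projection is a closed-form shift."""
    n = y.size
    return y + (x.sum() - y.sum()) / n

rng = np.random.default_rng(1)
x = rng.normal(size=5)
y = rng.normal(size=5)             # unconstrained model output f_theta(x)
y_proj = project_to_conserve_sum(x, y)
```

Composing an arbitrary network with such a projection gives exact conservation by construction, rather than the approximate conservation of a penalty term.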

Noether networks look interesting.

Also related, learning with a PDE constraint.

Not sure where Dax et al. (2021) fits in.

Erik Bekkers’ seminar on Group Equivariant Deep Learning looks interesting.

## Permutation invariance and equivariance

See deep sets.
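The deep sets construction is short enough to sketch inline (toy linear maps of my own choosing, not the paper's architecture): embed each element with a shared $\phi$, pool with a symmetric operation such as a sum, then apply $\rho$. The symmetric pooling is what makes the whole function permutation invariant.

```python
import numpy as np

def deep_set(x, W_phi, W_rho):
    """Deep-sets-style permutation-invariant function:
    rho(sum_i phi(x_i)), with toy tanh-linear phi and rho."""
    phi = np.tanh(x[:, None] * W_phi)   # per-element embedding, shape (n, d)
    pooled = phi.sum(axis=0)            # symmetric pooling -> invariance
    return np.tanh(pooled @ W_rho)      # rho applied to the pooled summary

rng = np.random.default_rng(2)
W_phi = rng.normal(size=4)
W_rho = rng.normal(size=4)
x = rng.normal(size=6)
out = deep_set(x, W_phi, W_rho)
out_perm = deep_set(x[::-1], W_phi, W_rho)  # permuted input, same output
```

Swapping the sum for a per-element map without pooling would instead give permutation *equivariance*; the choice of pooling operator is where the symmetry lives.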

## References

*Nature Computational Science* 2 (7): 433–42.

*Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS)*, 6.

*arXiv:2003.04630 [Physics, Stat]*, July.

*arXiv:2111.13139 [Astro-Ph, Physics:gr-Qc, Stat]*, November.

*Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS)*, 6.

*arXiv:2104.05508 [Cs, Stat]*, April.

*Advances in Neural Information Processing Systems 32*, edited by H. Wallach, H. Larochelle, A. Beygelzimer, F. d Alché-Buc, E. Fox, and R. Garnett, 15379–89. Curran Associates, Inc.

*arXiv:2010.07033 [Physics, Stat]*.

*Proceedings of the 20th Conference of the International Society for Music Information Retrieval*, 8.

*arXiv:1907.04490 [Cs, Eess, Stat]*, July.

*arXiv:2002.00021 [Physics]*, February.

*Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS)*, 8.

*arXiv:1701.02440 [Cs, Math, Stat]*, January.

*Journal of Computational Physics* 378 (February): 686–707.

*2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)*, 2241–45.

*Journal of Computational Physics* 394 (October): 56–81.
