In physics, we are typically concerned with identifying true parameters for universal laws, applicable without prejudice across all the cosmos. We are hunting something like the Platonic ideals of which our experiments are poor shadows. This is especially so in, say, quantum physics or cosmology.

In machine learning, we typically want to make generic predictions for a given process, and to quantify how good those predictions can be given how much data we have and the approximate kind of process we witness; there is no notion of universal truth waiting around the corner to back up our wild fancies. On the other hand, we are less concerned about the noisy sublunary chaos of experiments, and we don’t need to worry about how far noise drives us from universal truth as long as we make good predictions in the local problem at hand. But here, far from universality, we have only weak and vague notions of how to generalise our models to new circumstances and new noise. That is, in the Platonic ideal of machine learning, there are no Platonic ideals to be found.

(This explanation does no justice to either physics or machine learning, but it will do as framing rather than getting too deep into the history or philosophy of science.)

Can these areas nevertheless have something to say to one another? After an interesting conversation with Shane Keating about the difficulties of ocean dynamics, I am thinking about this in a new way. Generally, we might have notions from physics of what “truly” underlies a system, but many unknown parameters, noisy measurements, computational intractability and complex or chaotic dynamics interfere with our ability to predict things using known laws of physics alone. Here we want to come up with a “best possible” stochastic model of the system given our uncertainties and constraints, which looks more like an ML problem.

At a basic level, it’s not controversial (I don’t think?) to use machine learning methods to analyse experimental data, even with trendy deep neural networks. I understand this is significant in, e.g., connectomics.

Perhaps a little more fringe is using machine learning to reduce computational burden via surrogate models, e.g. Carleo and Troyer (2017).

The thing that is especially interesting to me right now is learning the whole model in an ML formalism, using physical laws as input to the learning process.

To be concrete, Shane was specifically discussing problems in predicting and interpolating “tracers”, such as chemicals or heat, in oceanographic flows. Here we know a lot about the fluids concerned, but less about the details of the ocean floor, and our measurements are imperfect. Nonetheless, we also know that certain invariants, conservation laws etc. hold, so a truly “nonparametric” approach to the dynamics certainly throws away information.
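As a toy illustration of why those invariants matter: a conservative (flux-form) discretisation of tracer advection preserves total tracer mass by construction, whatever its other errors, whereas a generic black-box regression would have to learn that property from data. A minimal sketch, with made-up grid parameters:

```python
import numpy as np

# 1D periodic advection of a tracer with a conservative (flux-form) upwind
# scheme; grid, speed, and time step are illustrative values.
nx, dx, dt, u = 100, 1.0, 0.5, 1.0       # CFL number u*dt/dx = 0.5
c = np.exp(-0.5 * ((np.arange(nx) - 30) / 5.0) ** 2)  # initial tracer blob

def step(c):
    flux = u * c                          # upwind flux for u > 0
    # Each cell loses flux through its right face and gains it through its
    # left face; over a periodic domain the total change telescopes to zero.
    return c - (dt / dx) * (flux - np.roll(flux, 1))

mass0 = c.sum()
for _ in range(200):
    c = step(c)
# c.sum() still equals mass0 to machine precision: mass conservation is
# built into the discretisation, not learned from data.
```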

There is some cute work in learning approximations to physics, like the *SINDy* method, which sits somewhere at the intersection of compressive sensing, state filters and maybe even Koopman operators (Brunton, Proctor, and Kutz 2016); but it’s hard to imagine scaling this up (at least directly) to big things like large image-sensor arrays and other such weakly structured input.
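To make the SINDy idea concrete, here is a minimal toy reconstruction (my own sketch, not the authors’ code): regress observed derivatives onto a library of candidate terms, then prune small coefficients with sequential thresholded least squares. The library and threshold here are illustrative assumptions.

```python
import numpy as np

def stlsq(theta, dxdt, threshold=0.1, iters=10):
    """Sequential thresholded least squares, the sparse solver behind SINDy."""
    xi = np.linalg.lstsq(theta, dxdt, rcond=None)[0]
    for _ in range(iters):
        small = np.abs(xi) < threshold
        xi[small] = 0.0
        if small.all():
            break
        keep = ~small
        xi[keep] = np.linalg.lstsq(theta[:, keep], dxdt, rcond=None)[0]
    return xi

# Toy data from dx/dt = -2 x; derivatives are taken as known here, whereas in
# practice they must be estimated from noisy trajectories.
x = np.linspace(0.1, 2.0, 200)
dxdt = -2.0 * x
# Library of candidate right-hand-side terms: 1, x, x^2, x^3.
theta = np.column_stack([np.ones_like(x), x, x ** 2, x ** 3])
xi = stlsq(theta, dxdt)
# xi is sparse: only the coefficient on x survives, recovering roughly -2.
```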

Researchers like Chang et al. (2017) claim that learning “compositional object” models should be possible. The compositional models are learnable objects with learnable pairwise interactions, and bear a passing resemblance to the physical laws that physics experiments hope to discover, although I’m not yet totally persuaded of the details of this particular framework. On the other hand, unmotivated appeals to autoencoders as descriptions of the underlying dynamics of physical reality don’t seem sufficient either.
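For flavour, here is the shape of the idea (a hand-rolled sketch, not the Chang et al. architecture in detail): one shared function is applied to every ordered pair of object states, the resulting messages are summed per object, and a second function updates each object from its own state plus the aggregate. The weights here are random stand-ins for what would be trained end to end.

```python
import numpy as np

rng = np.random.default_rng(0)
n_obj, d_state, d_msg = 5, 4, 8        # sizes are arbitrary, for illustration
states = rng.normal(size=(n_obj, d_state))

# Stand-in "learned" weights; in a real model these are trained end to end.
W_pair = 0.1 * rng.normal(size=(2 * d_state, d_msg))
W_update = 0.1 * rng.normal(size=(d_state + d_msg, d_state))

def relu(a):
    return np.maximum(a, 0.0)

# One shared pairwise function applied to every ordered pair (i, j), with
# incoming messages summed per object. Sharing the function across pairs is
# what makes the model compositional: it is indifferent to how many objects
# there are and which is which.
messages = np.zeros((n_obj, d_msg))
for i in range(n_obj):
    for j in range(n_obj):
        if i != j:
            messages[i] += relu(np.concatenate([states[i], states[j]]) @ W_pair)

# Per-object update from its own state plus its aggregated interactions.
next_states = np.concatenate([states, messages], axis=1) @ W_update
```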

There is an O’Reilly podcast and reflist about deep learning for science in particular. There has been a special track for papers in this area at NeurIPS.

Related: “SciML”, which often seems to mean learning ODEs in particular, is important. See the various SciML conferences, e.g. the one at ICERM.

Figure: sample images of atmospheric rivers correctly classified (true positives) by a deep CNN model, showing total column water vapor (colour map) and the land-sea boundary (solid line). From Y. Liu et al. (2016).

## Data-informed inference for physical systems

See Physics-based Deep Learning (Thuerey et al. 2021). Also see Brunton and Kutz’s Data-Driven Science and Engineering web material around their book (Brunton and Kutz 2019); the seminar series by the authors of that latter book is a moving feast of the latest results in this area. For neural differential equations in particular, Patrick Kidger’s thesis seems good (Kidger 2022).

## ML for PDEs

See ML PDEs.

## Causality, identifiability, and observational data

One ML-flavoured notion is the use of observational data to derive the models. Presumably, if I am modelling an entire ocean or even a river, doing experiments is out of the question for reasons of cost and ethics, and the overall model will be calibrated with observational data: we need to wait until there is a flood to see what floods do. This is generally done badly in ML, but there are formalisms for it, as seen in graphical models for causal inference. Can we work out the confounders and do counterfactual inference? Does imposing an arrow of causation already do some work for us?
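A minimal simulated example of the confounding problem (a toy structural model, not an ocean): regressing the outcome on the treatment alone is biased by a common cause, while adjusting for the observed confounder recovers the causal coefficient.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000

# Toy structural model: confounder z drives both "treatment" x and outcome y.
# The true causal effect of x on y is 2.
z = rng.normal(size=n)
x = z + rng.normal(size=n)
y = 2.0 * x + 3.0 * z + rng.normal(size=n)

# Naive regression of y on x is biased by the open backdoor path x <- z -> y;
# analytically the slope converges to 2 + 3 * cov(x, z) / var(x) = 3.5.
naive = np.linalg.lstsq(np.column_stack([x, np.ones(n)]), y, rcond=None)[0][0]

# Backdoor adjustment: conditioning on z recovers the causal coefficient.
adjusted = np.linalg.lstsq(np.column_stack([x, z, np.ones(n)]), y, rcond=None)[0][0]
# naive is approximately 3.5, adjusted approximately 2.0
```

Of course, this only works because the confounder is observed and the graph is known; with an uninstrumented ocean, neither is guaranteed.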

Small subsystems might be informed by experiments, of course.

## Likelihood free inference

This is popular when you have a simulator that can sample from the system. See likelihood-free inference.
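The simplest instance is rejection ABC (approximate Bayesian computation): draw parameters from the prior, run the simulator, and keep the draws whose simulated summary statistic lands close to the observed one. A toy sketch, with an assumed Gaussian simulator, flat prior, and tolerance of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(1)

# "Observed" data from a Gaussian with unknown mean; we pretend the likelihood
# is unavailable and that we can only run the simulator forward.
theta_true = 2.0
x_obs = rng.normal(theta_true, 1.0, size=100)
s_obs = x_obs.mean()                     # summary statistic of the data

def simulate(theta, rng):
    return rng.normal(theta, 1.0, size=100)

# Rejection ABC: keep prior draws whose simulated summary lands within eps of
# the observed one. Prior and tolerance are illustrative choices.
eps = 0.1
accepted = []
for _ in range(20_000):
    theta = rng.uniform(-5.0, 5.0)       # draw from a flat prior
    if abs(simulate(theta, rng).mean() - s_obs) < eps:
        accepted.append(theta)

posterior_mean = np.mean(accepted)       # sits near theta_true
```

Rejection sampling is wasteful; the serious versions of this replace the hard accept/reject with learned surrogates of the likelihood or posterior.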

## Emulation approaches

## The other direction: What does physics say about learning?

See why does deep learning work or the statistical mechanics of statistics.

Related, maybe: the recovery phase transitions in compressed sensing.

## But statistics is ML

Why not “*statistics* for physical sciences”?
Isn’t ML just statistics?
Why thanks, Dan, for asking that.
*Yes it is*, as far as content goes.
But the different disciplines licence different uses of the tools.
Pragmatically, using the predictive-modelling tools that ML practitioners advocate has been helpful in doing better statistics.
When we talk about statistics for physical processes, we tend to think of your grandpappy’s statistics: parametric methods where the parameters are the parameters of physical laws.
The modern emphasis in machine learning is on nonparametric, overparameterised or approximate methods that do not necessarily correspond to the world in any interpretable way.
Deep learning etc.
But sure, that is still statistics if you like.
I would have needed to spend more words explaining that though, and buried the lede.

## Incoming

- [2305.20053] Efficient PDE-Constrained optimization under high-dimensional uncertainty using derivative-informed neural operators
- Microsoft Research branded: AI4Science to empower the fifth paradigm of scientific discovery
- ML for Physics and Physics for ML Tutorial
- Physics-constrained machine learning for scientific computing - Amazon Science

## References

*Acta Numerica* 30 (May): 1–86.

*Water Resources Research* 51 (8): 5957–73.

*Advances In Neural Information Processing Systems*, 12.

*Biology Letters* 14 (5): 20170660.

*Proceedings of the National Academy of Sciences* 116 (31): 15344–49.

*Journal of Nonlinear Science* 29 (4): 1563–1619.

*Data-Driven Science and Engineering: Machine Learning, Dynamical Systems, and Control*. Cambridge: Cambridge University Press.

*Proceedings of the National Academy of Sciences* 113 (15): 3932–37.

*Science* 355 (6325): 602–6.

*Proceedings of ICLR*.

*Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS)*, 6.

*Journal of Hydrology* 564 (September): 191–207.

*arXiv:2110.13041 [Physics]*, October.

*International Journal of Wildland Fire* 23 (1): 46.

*Acta Numerica* 30 (May): 445–554.

*Computer Methods in Applied Mechanics and Engineering* 375 (March): 113533.

*Journal of Agricultural, Biological and Environmental Statistics* 23 (1): 39–62.

*Frontiers in Environmental Science* 3 (April).

*arXiv:2012.11857 [Cs, Math, Stat]*, December.

*Advances in Water Resources* 141 (July): 103610.

*Frontiers in Applied Mathematics and Statistics* 7.

*ICLR*, 5.

*ICLR*.

*ACM Transactions on Graphics* 38 (6): 1–16.

*Nature Reviews Physics* 3 (6): 422–40.

*Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences* 379 (2194): 20200093.

*arXiv:2001.08055 [Physics, Stat]*, January.

*Water* 12 (1): 96.

*Physica D: Nonlinear Phenomena* 417 (March): 132830.

*arXiv:2107.10127 [Math, Stat]*, July.

*arXiv:2007.00631 [Cs, Stat]*, July.

*Journal of the American Statistical Association* 0 (0): 1–18.

*arXiv:1605.01156 [Cs]*, May.

*Geoscientific Model Development* 12 (5): 1791–1807.

*SIAM Review* 63 (1): 208–28.

*arXiv:2107.10879 [Physics]*, July.

*arXiv:2107.11253 [Nlin, Physics:physics, Stat]*, July.

*Npj Computational Materials* 2 (1): 1.

*arXiv:2203.16797 [Cs, Stat]*, March.

*Neural Networks*, Computational Intelligence in Earth and Environmental Sciences, 20 (4): 462–78.

*Water Resources Research* 53 (12): 10802–23.

*Probabilistic Engineering Mechanics* 57 (July): 14–25.

*arXiv:1910.01751 [Cs, Stat]*, October.

*Advances In Neural Information Processing Systems*.

*Advances In Neural Information Processing Systems*, 8.

*Physical Review Letters* 120 (2): 024102.

*Chaos: An Interdisciplinary Journal of Nonlinear Science* 27 (12): 121102.

*Journal of Computational Physics* 477 (March): 111902.

*Physica D: Nonlinear Phenomena* 406 (May): 132401.

*Journal of Open Source Software* 8 (89): 5510.

*MIT Web Domain*, 6.

*arXiv:2003.11755 [Cs, Stat]*, March.

*Journal of Computational Physics* 378 (February): 686–707.

*Science* 367 (6481): 1026–30.

*arXiv:2109.07573 [Physics]*, September.

*Environmental Modelling & Software* 144 (October): 105159.

*Water Resources Research* 48 (7).

*Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS)*, 6.

*arXiv:1910.09349 [Cs, Stat]*, March.

*Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS)*, 11.

*Proceedings of the 37th International Conference on Machine Learning*, 8459–68. PMLR.

*Journal of Computational and Theoretical Nanoscience* 6 (10): 2283–97.

*NeurIPS*, 5.

*Artificial Neural Networks and Machine Learning – ICANN 2011*, edited by Timo Honkela, Włodzisław Duch, Mark Girolami, and Samuel Kaski, 6792:151–58. Lecture Notes in Computer Science. Berlin, Heidelberg: Springer.

*Water Resources Research* 56 (3).

*arXiv:2104.04764 [Physics]*, April.

*arXiv:2006.15641 [Cs, Stat]*, June.

*Physics-Based Deep Learning*. WWW.

*Proceedings of the 34th International Conference on Machine Learning - Volume 70*, 3424–33. ICML’17. Sydney, NSW, Australia: JMLR.org.

*44th AIAA Aerospace Sciences Meeting and Exhibit*. American Institute of Aeronautics and Astronautics.

*SIAM Journal on Scientific Computing* 42 (1): A292–317.

*Journal of Hydrology*, August, 125351.

*Spatial Statistics* 37 (June): 100408.

*Journal of Computational Physics* 411 (June): 109409.

*SIAM Journal on Scientific Computing* 42 (2): A639–65.

*Journal of Computational Physics* 397 (November): 108850.

*Journal of Computational Physics* 394 (October): 56–81.
