Machine learning for physical sciences

Turbulent mixing at the boundary between two disciplines with differing inertia and viscosity


Consider a spherical flame

In physics, typically, we are concerned with identifying True Parameters for Universal Laws, applicable without prejudice across all the cosmos. We are hunting something like the Platonic ideals that our experiments are poor shadows of. This is especially so in, say, quantum physics or cosmology.

In machine learning, typically we want to make generic predictions for a given process, and quantify how good those predictions can be given how much data we have and the approximate kind of process we witness, and there is no notion of universal truth waiting around the corner to back up our wild fancies. On the other hand, we are less concerned about the noisy sublunary chaos of experiments and don’t need to worry about how far our noise drives us from universal truth as long as we make good predictions in the local problem at hand. But here, far from universality, we have weak and vague notions of how to generalise our models to new circumstances and new noise. That is, in the Platonic ideal of machine learning, there are no Platonic ideals to be found.

(This explanation does no justice to either physics or machine learning, but this is framing rather than an essay in the history or philosophy of science.)

Can these areas have something to say to one another nevertheless? After an interesting conversation with Shane Keating about the difficulties of ocean dynamics, I am thinking about this in a new way. Generally, we might have notions from physics of what “truly” underlies a system, but many unknown parameters, noisy measurements, computational intractability and complex or chaotic dynamics interfere with our ability to predict things using known laws of physics alone. Here, we want to come up with a “best possible” stochastic model of the system given our uncertainties and constraints, which looks more like an ML problem.

At a basic level, it’s not controversial (I don’t think?) to use machine learning methods to analyse data in experiments, even with trendy deep neural networks. I understand that this is significant, e.g. in connectomics.

Perhaps a little more fringe is using machine learning to reduce computational burden via surrogate models, e.g. Carleo and Troyer (2017).
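To make the surrogate idea concrete, here is a minimal sketch: `expensive_simulator` is an invented stand-in for a costly physics solver, and the surrogate is a plain polynomial fit rather than the neural networks used in work like Carleo and Troyer (2017). The workflow is the same: evaluate the expensive model at a few design points, fit a cheap regressor to the input/output pairs, then query the regressor instead of the simulator.

```python
import numpy as np

def expensive_simulator(x):
    """Stand-in for a costly physics solver (invented for illustration)."""
    return np.sin(3 * x) + 0.5 * x ** 2

# Evaluate the expensive model at a modest number of design points...
x_train = np.linspace(-2, 2, 40)
y_train = expensive_simulator(x_train)

# ...then fit a cheap polynomial surrogate to the input/output pairs.
surrogate = np.poly1d(np.polyfit(x_train, y_train, deg=8))

# The surrogate now answers queries at negligible cost.
x_query = np.linspace(-2, 2, 200)
max_err = np.max(np.abs(surrogate(x_query) - expensive_simulator(x_query)))
```

In practice the surrogate would be a Gaussian process or neural network over a high-dimensional input, and the design points would be chosen adaptively, but the division of labour is the same.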

The thing that is especially interesting to me is learning the whole model in an ML formalism, using physical laws as input to the learning process.

To be concrete, Shane was specifically discussing problems in predicting and interpolating “tracers”, such as chemicals or heat, in oceanographic flows. Here we know a lot about the fluids concerned, but less about the details of the ocean floor, and our measurements are imperfect. Nonetheless, we also know that there are certain invariants, conservation laws and so on, so a truly “nonparametric” approach to dynamics is certainly throwing away information.
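The kind of structure a nonparametric model throws away can be shown with a toy: a 1-D advection-diffusion step for a tracer on a periodic domain (all numbers invented, and real oceanographic tracers of course live in 3-D turbulent flow). Because each update only moves tracer between neighbouring cells, total tracer mass is conserved by construction, a guarantee no black-box regressor offers for free.

```python
import numpy as np

# Periodic 1-D advection-diffusion of a tracer by upwind finite differences.
n, dx, dt = 200, 1.0, 0.1
u, kappa = 1.0, 0.5          # advection velocity and diffusivity (made up)
c = np.exp(-0.01 * (np.arange(n) - n / 2) ** 2)  # initial tracer blob

def step(c):
    # Upwind advection + central diffusion, periodic boundaries.
    adv = -u * (c - np.roll(c, 1)) / dx
    dif = kappa * (np.roll(c, 1) - 2 * c + np.roll(c, -1)) / dx ** 2
    return c + dt * (adv + dif)

mass0 = c.sum()
for _ in range(500):
    c = step(c)

# Total tracer mass is conserved exactly by the discretisation
# (up to floating point), however long we integrate.
```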

There is some cute work in learning approximations to physics, like the SINDy method, which sits somewhere at the intersection of compressive sensing, state filters and Koopman operators (Brunton, Proctor, and Kutz 2016); but it’s hard to imagine scaling this up (at least directly) to big things like large image sensor arrays and other such weakly structured input. Update: everyone is trying to scale this up these days. See Koopman operators.
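The core of SINDy fits in a few lines: regress numerical derivatives onto a library of candidate terms, then repeatedly zero out small coefficients and refit (sequentially thresholded least squares). Here is a toy sketch recovering dx/dt = -2x from trajectory data; the library, noise level and threshold are all invented for illustration, not taken from Brunton, Proctor, and Kutz (2016).

```python
import numpy as np

# Toy SINDy sketch: recover dx/dt = -2x from a noisy trajectory.
rng = np.random.default_rng(1)
t = np.linspace(0, 2, 400)
x = np.exp(-2 * t) + 1e-5 * rng.standard_normal(t.size)
dxdt = np.gradient(x, t)                      # numerical derivative

# Library of candidate right-hand-side terms: [1, x, x^2, x^3].
Theta = np.column_stack([np.ones_like(x), x, x ** 2, x ** 3])

# Sequentially thresholded least squares: fit, zero small terms, refit.
xi = np.linalg.lstsq(Theta, dxdt, rcond=None)[0]
for _ in range(10):
    small = np.abs(xi) < 0.1
    xi[small] = 0.0
    big = ~small
    xi[big] = np.linalg.lstsq(Theta[:, big], dxdt, rcond=None)[0]

# xi should end up close to [0, -2, 0, 0], i.e. dx/dt ≈ -2x.
```

The hard part at scale is not this regression but choosing a library of candidate terms for, say, raw sensor arrays, which is exactly the scaling problem noted above.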

Researchers like Chang et al. (2017) claim that learning “compositional object” models should be possible. The compositional models here are learnable objects with learnable pairwise interactions, and bear a passing resemblance to something like the physical laws that physics experiments hope to discover, although I’m not yet totally persuaded about the details of this particular framework. On the other hand, unmotivated appeals to autoencoders as descriptions of the underlying dynamics of physical reality don’t seem sufficient.
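The structural idea in such compositional models is simple to sketch: each object’s update is an aggregation of pairwise effects. In a real interaction network `pairwise_effect` would be a learned neural network; here it is a fixed spring-like attraction, invented purely to show the shape of the computation. Because the pairwise effects are antisymmetric, the centroid is invariant, a physics-like property that falls out of the composition.

```python
import numpy as np

# Interaction-network-style update: aggregate pairwise effects per object.
def pairwise_effect(xi, xj):
    # Stand-in for a learned interaction function (e.g. a small MLP);
    # here a spring-like pull toward the other object.
    return 0.1 * (xj - xi)

def step(states):
    effects = np.zeros_like(states)
    for i in range(len(states)):
        for j in range(len(states)):
            if i != j:
                effects[i] += pairwise_effect(states[i], states[j])
    return states + effects      # object-wise update from aggregated effects

states = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
centroid0 = states.mean(axis=0)
for _ in range(50):
    states = step(states)
# All objects are drawn toward the common centroid, which never moves.
```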

There is an O’Reilly podcast and reflist about deep learning for science in particular. There was a special track for papers in this area at NeurIPS.

Related: “sciml”, which often seems to mean learning ODEs in particular, is important here. See the various SciML conferences, e.g. at ICERM.

CNN classification of atmospheric rivers

Sample images of atmospheric rivers correctly classified (true positive) by our deep CNN model. Figure shows total column water vapor (color map) and land sea boundary (solid line). Y. Liu et al. (2016)

ML for PDEs

See ML PDEs.

Causality, identifiability, and observational data

One ML-flavoured notion here is the use of observational data to derive the models. Presumably, if I am modelling an entire ocean or even a river, doing experiments is out of the question for reasons of cost and ethics, and the overall model will be calibrated with observational data; we will need to wait until there is a flood to see what floods do. This is generally done badly in ML, but there are formalisms for it, as seen in graphical models for causal inference. Can we work out the confounders and do counterfactual inference? Is imposing an arrow of causation already doing some work for us here?
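The confounding problem can be simulated in a few lines. Suppose an unobserved driver z (say, a weather regime) influences both a forcing x and a response y; then a naive regression of y on x overstates the causal effect, while adjusting for z recovers it. All variable names and coefficients here are invented; the point is only that in the field we rarely observe z, which is the whole difficulty.

```python
import numpy as np

# Toy confounding example for observational calibration.
rng = np.random.default_rng(3)
n = 10000
z = rng.standard_normal(n)                      # hidden confounder
x = z + rng.standard_normal(n)                  # observed forcing, partly driven by z
y = 1.0 * x + 2.0 * z + rng.standard_normal(n)  # true causal effect of x is 1

naive = np.polyfit(x, y, 1)[0]                  # observational slope: biased to ~2
adjusted = np.linalg.lstsq(
    np.column_stack([x, z]), y, rcond=None
)[0][0]                                         # adjusting for z recovers ~1
```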

Small subsystems might be informed by experiments, of course.

Likelihood free inference

Popular when you have a simulator that can sample from the system but no tractable likelihood. See likelihood free inference.
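The simplest instance is ABC rejection sampling: draw parameters from the prior, run the simulator, and keep the draws whose simulated summary statistic lands near the observed one. The following sketch infers the rate of an exponential model; the prior, summary statistic and tolerance are all invented for illustration.

```python
import numpy as np

# Minimal ABC rejection sketch: infer a rate parameter when we can only
# simulate from the model, not evaluate its likelihood.
rng = np.random.default_rng(2)

def simulate(theta, n=100):
    return rng.exponential(1.0 / theta, size=n)   # simulator with rate theta

observed = simulate(2.0)                          # pretend field data
summary = observed.mean()

# Keep prior draws whose simulated summary is close to the observed one.
accepted = []
for _ in range(5000):
    theta = rng.uniform(0.1, 5.0)                 # prior
    if abs(simulate(theta).mean() - summary) < 0.05:
        accepted.append(theta)

posterior_mean = np.mean(accepted)                # crude posterior estimate
```

Real problems replace the brute-force rejection loop with emulators or neural density estimators, but the logic of matching simulations to observations is the same.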

Emulation approaches

See Emulation and surrogates.

The other direction: What does physics say about learning?

See why does deep learning work or the statistical mechanics of statistics.

Related, maybe: the recovery phase transitions in compressed sensing.

But statistics is ML

Why not “statistics for physical sciences”? Isn’t ML just statistics? Why thanks, Dan, for asking that. I would argue yes, it is, but the emphasis is different. When we talk about statistics in physical processes we tend to think of your grandpappy’s statistics: parametric methods where the parameters are the parameters of physical laws. The modern emphasis in machine learning is on nonparametric, overparameterised or approximate methods that do not necessarily correspond to the world in any interpretable way. Deep learning etc. But sure, that is still statistics if you like. I would have needed to spend more words explaining that, though, and buried the lede.

Applications

bushfires, hydrology, climate models, molecular dynamics…

References

Altosaar, Jaan, Rajesh Ranganath, and Kyle Cranmer. 2019. “Hierarchical Variational Models for Statistical Physics.” In, 5.
Asher, M. J., B. F. W. Croke, A. J. Jakeman, and L. J. M. Peeters. 2015. “A Review of Surrogate Models and Their Application to Groundwater Modeling.” Water Resources Research 51 (8): 5957–73. https://doi.org/10.1002/2015WR016967.
Atkinson, Steven, Waad Subber, and Liping Wang. 2019. “Data-Driven Discovery of Free-Form Governing Differential Equations.” In, 7.
Ayed, Ibrahim, and Emmanuel de Bézenac. 2019. “Learning Dynamical Systems from Partial Observations.” In Advances In Neural Information Processing Systems, 12.
Baker, Ruth E., Jose-Maria Peña, Jayaratnam Jayamohan, and Antoine Jérusalem. 2018. “Mechanistic Models Versus Machine Learning, a Fight Worth Fighting for the Biological Community?” Biology Letters 14 (5): 20170660. https://doi.org/10.1098/rsbl.2017.0660.
Beck, Christian, Weinan E, and Arnulf Jentzen. 2019. “Machine Learning Approximation Algorithms for High-Dimensional Fully Nonlinear Partial Differential Equations and Second-Order Backward Stochastic Differential Equations.” Journal of Nonlinear Science 29 (4): 1563–1619. https://doi.org/10.1007/s00332-018-9525-3.
Brehmer, Johann, Kyle Cranmer, Siddharth Mishra-Sharma, Felix Kling, and Gilles Louppe. 2019. “Mining Gold: Improving Simulation-Based Inference with Latent Information.” In, 7.
Brunton, Steven L., Joshua L. Proctor, and J. Nathan Kutz. 2016. “Discovering Governing Equations from Data by Sparse Identification of Nonlinear Dynamical Systems.” Proceedings of the National Academy of Sciences 113 (15): 3932–37. https://doi.org/10.1073/pnas.1517384113.
Carleo, Giuseppe, and Matthias Troyer. 2017. “Solving the Quantum Many-Body Problem with Artificial Neural Networks.” Science 355 (6325): 602–6. https://doi.org/10.1126/science.aag2302.
Chang, Michael B., Tomer Ullman, Antonio Torralba, and Joshua B. Tenenbaum. 2017. “A Compositional Object-Based Approach to Learning Physical Dynamics.” In Proceedings of ICLR. http://arxiv.org/abs/1612.00341.
Cranmer, Miles D, Rui Xu, Peter Battaglia, and Shirley Ho. 2019. “Learning Symbolic Physics with Graph Networks.” In Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS), 6.
Cui, Tao, Luk Peeters, Dan Pagendam, Trevor Pickett, Huidong Jin, Russell S. Crosbie, Matthias Raiber, David W. Rassam, and Mat Gilfedder. 2018. “Emulator-Enabled Approximate Bayesian Computation (ABC) and Uncertainty Analysis for Computationally Expensive Groundwater Models.” Journal of Hydrology 564 (September): 191–207. https://doi.org/10.1016/j.jhydrol.2018.07.005.
Filippi, Jean-Baptiste, Vivien Mallet, and Bahaa Nader. 2014. “Representation and Evaluation of Wildfire Propagation Simulations.” International Journal of Wildland Fire 23 (1): 46. https://doi.org/10.1071/WF12202.
Girolami, Mark, Eky Febrianto, Ge Yin, and Fehmi Cirak. 2020. “The Statistical Finite Element Method (statFEM) for Coherent Synthesis of Observation Data and Model Predictions.” July 30, 2020. http://arxiv.org/abs/1905.06391.
Gladish, Daniel W., Daniel E. Pagendam, Luk J. M. Peeters, Petra M. Kuhnert, and Jai Vaze. 2018. “Emulation Engines: Choice and Quantification of Uncertainty for Complex Hydrological Models.” Journal of Agricultural, Biological and Environmental Statistics 23 (1): 39–62. https://doi.org/10.1007/s13253-017-0308-3.
Goldstein, Evan B., and Giovanni Coco. 2015. “Machine Learning Components in Deterministic Models: Hybrid Synergy in the Age of Data.” Frontiers in Environmental Science 3 (April). https://doi.org/10.3389/fenvs.2015.00033.
Gulian, Mamikon, Ari Frankel, and Laura Swiler. 2020. “Gaussian Process Regression Constrained by Boundary Value Problems.” December 22, 2020. http://arxiv.org/abs/2012.11857.
He, QiZhi, David Barajas-Solano, Guzel Tartakovsky, and Alexandre M. Tartakovsky. 2020. “Physics-Informed Neural Networks for Multiphysics Data Assimilation with Application to Subsurface Transport.” Advances in Water Resources 141 (July): 103610. https://doi.org/10.1016/j.advwatres.2020.103610.
Holl, Philipp, Nils Thuerey, and Vladlen Koltun. 2019. “Learning to Control PDEs with Differentiable Physics.” In, 5.
Hu, Yuanming, Luke Anderson, Tzu-Mao Li, Qi Sun, Nathan Carr, Jonathan Ragan-Kelley, and Frédo Durand. 2020. DiffTaichi: Differentiable Programming for Physical Simulation.” In ICLR. http://arxiv.org/abs/1910.00935.
Hu, Yuanming, Tzu-Mao Li, Luke Anderson, Jonathan Ragan-Kelley, and Frédo Durand. 2019. “Taichi: A Language for High-Performance Computation on Spatially Sparse Data Structures.” ACM Transactions on Graphics 38 (6): 1–16. https://doi.org/10.1145/3355089.3356506.
Kasim, M. F., D. Watson-Parris, L. Deaconu, S. Oliver, P. Hatfield, D. H. Froula, G. Gregori, et al. 2020. “Up to Two Billion Times Acceleration of Scientific Simulations with Deep Neural Architecture Search.” January 17, 2020. http://arxiv.org/abs/2001.08055.
Kasim, Muhammad, J Topp-Mugglestone, P Hatfield, D H Froula, G Gregori, M Jarvis, E Viezzer, and Sam Vinko. 2019. “A Million Times Speed up in Parameters Retrieval with Deep Learning.” In, 5.
Kimura, Nobuaki, Ikuo Yoshinaga, Kenji Sekijima, Issaku Azechi, and Daichi Baba. 2020. “Convolutional Neural Network Coupled with a Transfer-Learning Approach for Time-Series Flood Predictions.” Water 12 (1, 1): 96. https://doi.org/10.3390/w12010096.
Li, Yunzhu, Antonio Torralba, Animashree Anandkumar, Dieter Fox, and Animesh Garg. 2020. “Causal Discovery in Physical Systems from Videos.” July 2, 2020. http://arxiv.org/abs/2007.00631.
Liu, Xiao, Kyongmin Yeo, and Siyuan Lu. 2020. “Statistical Modeling for Spatio-Temporal Data From Stochastic Convection-Diffusion Processes.” Journal of the American Statistical Association 0 (0): 1–18. https://doi.org/10.1080/01621459.2020.1863223.
Liu, Yunjie, Evan Racah, Prabhat, Joaquin Correa, Amir Khosrowshahi, David Lavers, Kenneth Kunkel, Michael Wehner, and William Collins. 2016. “Application of Deep Convolutional Neural Networks for Detecting Extreme Weather in Climate Datasets.” May 4, 2016. http://arxiv.org/abs/1605.01156.
Lu, Dan, and Daniel Ricciuto. 2019. “Efficient Surrogate Modeling Methods for Large-Scale Earth System Models Based on Machine-Learning Techniques.” Geoscientific Model Development 12 (5): 1791–1807. https://doi.org/10.5194/gmd-12-1791-2019.
Lu, Lu, Zhiping Mao, and Xuhui Meng. 2019. DeepXDE: A Deep Learning Library for Solving Differential Equations.” In, 6. http://arxiv.org/abs/1907.04502.
Medasani, Bharat, Anthony Gamst, Hong Ding, Wei Chen, Kristin A. Persson, Mark Asta, Andrew Canning, and Maciej Haranczyk. 2016. “Predicting Defect Behavior in B2 Intermetallics by Merging Ab Initio Modeling and Machine Learning.” Npj Computational Materials 2 (1): 1. https://doi.org/10.1038/s41524-016-0001-z.
Merwe, Rudolph van der, Todd K. Leen, Zhengdong Lu, Sergey Frolov, and Antonio M. Baptista. 2007. “Fast Neural Network Surrogates for Very High Dimensional Physics-Based Models in Computational Oceanography.” Neural Networks, Computational Intelligence in Earth and Environmental Sciences, 20 (4): 462–78. https://doi.org/10.1016/j.neunet.2007.04.023.
Mo, Shaoxing, Dan Lu, Xiaoqing Shi, Guannan Zhang, Ming Ye, Jianfeng Wu, and Jichun Wu. 2017. “A Taylor Expansion-Based Adaptive Design Strategy for Global Surrogate Modeling With Applications in Groundwater Modeling.” Water Resources Research 53 (12): 10802–23. https://doi.org/10.1002/2017WR021622.
Nabian, Mohammad Amin, and Hadi Meidani. 2019. “A Deep Learning Solution Approach for High-Dimensional Random Differential Equations.” Probabilistic Engineering Mechanics 57 (July): 14–25. https://doi.org/10.1016/j.probengmech.2019.05.001.
Nair, Suraj, Yuke Zhu, Silvio Savarese, and Li Fei-Fei. 2019. “Causal Induction from Visual Observations for Goal Directed Tasks.” October 3, 2019. http://arxiv.org/abs/1910.01751.
Ng, Ignavier, Shengyu Zhu, Zhitang Chen, and Zhuangyan Fang. 2019. “A Graph Autoencoder Approach to Causal Structure Learning.” In Advances In Neural Information Processing Systems. http://arxiv.org/abs/1911.07420.
Paleyes, Andrei, Mark Pullin, Maren Mahsereci, Neil Lawrence, and Javier Gonzalez. 2019. “Emulation of Physical Processes with Emukit.” In Advances In Neural Information Processing Systems, 8. https://ml4physicalsciences.github.io/files/NeurIPS_ML4PS_2019_113.pdf.
Park, Ji Hwan, Shinjae Yoo, and Balu Nadiga. 2019. “Machine Learning Climate Variability.” In, 5.
Partee, Sam, Michael Ringenburg, Benjamin Robbins, and Andrew Shao. 2019. “Model Parameter Optimization: ML-Guided Trans-Resolution Tuning of Physical Models.” In. Zenodo.
Pathak, Jaideep, Brian Hunt, Michelle Girvan, Zhixin Lu, and Edward Ott. 2018. “Model-Free Prediction of Large Spatiotemporally Chaotic Systems from Data: A Reservoir Computing Approach.” Physical Review Letters 120 (2): 024102. https://doi.org/10.1103/PhysRevLett.120.024102.
Pathak, Jaideep, Zhixin Lu, Brian R. Hunt, Michelle Girvan, and Edward Ott. 2017. “Using Machine Learning to Replicate Chaotic Attractors and Calculate Lyapunov Exponents from Data.” Chaos: An Interdisciplinary Journal of Nonlinear Science 27 (12): 121102. https://doi.org/10.1063/1.5010300.
Portwood, Gavin D, Peetak P Mitra, Mateus Dias Ribeiro, Tan Minh Nguyen, Balasubramanya T Nadiga, Juan A Saenz, Michael Chertkov, and Animesh Garg. 2019. “Turbulence Forecasting via Neural ODE.” In, 7.
Qian, Elizabeth, Boris Kramer, Benjamin Peherstorfer, and Karen Willcox. 2020. “Lift & Learn: Physics-Informed Machine Learning for Large-Scale Nonlinear Dynamical Systems.” Physica D: Nonlinear Phenomena 406 (May): 132401. https://doi.org/10.1016/j.physd.2020.132401.
Raghu, Maithra, and Eric Schmidt. 2020. “A Survey of Deep Learning for Scientific Discovery.” March 26, 2020. http://arxiv.org/abs/2003.11755.
Raissi, Maziar, P. Perdikaris, and George Em Karniadakis. 2019. “Physics-Informed Neural Networks: A Deep Learning Framework for Solving Forward and Inverse Problems Involving Nonlinear Partial Differential Equations.” Journal of Computational Physics 378 (February): 686–707. https://doi.org/10.1016/j.jcp.2018.10.045.
Raissi, Maziar, Alireza Yazdani, and George Em Karniadakis. 2020. “Hidden Fluid Mechanics: Learning Velocity and Pressure Fields from Flow Visualizations.” Science 367 (6481): 1026–30. https://doi.org/10.1126/science.aaw4741.
Razavi, Saman, Bryan A. Tolson, and Donald H. Burn. 2012. “Review of Surrogate Modeling in Water Resources.” Water Resources Research 48 (7). https://doi.org/10.1029/2011WR011527.
Rezende, Danilo J, Sébastien Racanière, Irina Higgins, and Peter Toth. 2019. “Equivariant Hamiltonian Flows.” In Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS), 6.
Saemundsson, Steindor, Alexander Terenin, Katja Hofmann, and Marc Peter Deisenroth. 2020. “Variational Integrator Networks for Physically Structured Embeddings.” March 2, 2020. http://arxiv.org/abs/1910.09349.
Sanchez-Gonzalez, Alvaro, Victor Bapst, Peter Battaglia, and Kyle Cranmer. 2019. “Hamiltonian Graph Networks with ODE Integrators.” In Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS), 11.
Sanchez-Gonzalez, Alvaro, Jonathan Godwin, Tobias Pfaff, Rex Ying, Jure Leskovec, and Peter W. Battaglia. 2020. “Learning to Simulate Complex Physics with Graph Networks.” In. http://arxiv.org/abs/2002.09405.
Sargsyan, Khachik, Bert Debusschere, Habib Najm, and Youssef Marzouk. 2009. “Bayesian Inference of Spectral Expansions for Predictability Assessment in Stochastic Reaction Networks.” Journal of Computational and Theoretical Nanoscience 6 (10): 2283–97. https://doi.org/10.1166/jctn.2009.1285.
Sarkar, Soumalya, and Michael Joly. 2019. “Multi-Fidelity Learning with Heterogeneous Domains.” In, 5.
Särkkä, Simo. 2011. “Linear Operators and Stochastic Partial Differential Equations in Gaussian Process Regression.” In Artificial Neural Networks and Machine LearningICANN 2011, edited by Timo Honkela, Włodzisław Duch, Mark Girolami, and Samuel Kaski, 6792:151–58. Lecture Notes in Computer Science. Berlin, Heidelberg: Springer. https://doi.org/10.1007/978-3-642-21738-8_20.
Siade, Adam J., Tao Cui, Robert N. Karelse, and Clive Hampton. 2020. “Reduced‐Dimensional Gaussian Process Machine Learning for Groundwater Allocation Planning Using Swarm Theory.” Water Resources Research 56 (3). https://doi.org/10.1029/2019WR026061.
Tait, Daniel J., and Theodoros Damoulas. 2020. “Variational Autoencoding of PDE Inverse Problems.” June 28, 2020. http://arxiv.org/abs/2006.15641.
Tartakovsky, Alexandre M., Carlos Ortiz Marrero, Paris Perdikaris, Guzel D. Tartakovsky, and David Barajas-Solano. 2018. “Learning Parameters and Constitutive Relationships with Physics Informed Deep Neural Networks,” August. https://arxiv.org/abs/1808.03398v2.
Tompson, Jonathan, Kristofer Schlachter, Pablo Sprechmann, and Ken Perlin. 2017. “Accelerating Eulerian Fluid Simulation with Convolutional Networks.” In Proceedings of the 34th International Conference on Machine Learning - Volume 70, 3424–33. ICML’17. Sydney, NSW, Australia: JMLR.org. http://proceedings.mlr.press/v70/tompson17a.html.
Witteveen, Jeroen A. S., and Hester Bijl. 2006. “Modeling Arbitrary Uncertainties Using Gram-Schmidt Polynomial Chaos.” In 44th AIAA Aerospace Sciences Meeting and Exhibit. American Institute of Aeronautics and Astronautics. https://doi.org/10.2514/6.2006-896.
Yang, Liu, Dongkun Zhang, and George Em Karniadakis. 2020. “Physics-Informed Generative Adversarial Networks for Stochastic Differential Equations.” SIAM Journal on Scientific Computing 42 (1): A292–317. https://doi.org/10.1137/18M1225409.
Yu, Xiayang, Tao Cui, J. Sreekanth, Stephane Mangeon, Rebecca Doble, Pei Xin, David Rassam, and Mat Gilfedder. 2020. “Deep Learning Emulators for Groundwater Contaminant Transport Modelling.” Journal of Hydrology, August, 125351. https://doi.org/10.1016/j.jhydrol.2020.125351.
Zammit-Mangion, Andrew, and Christopher K. Wikle. 2020. “Deep Integro-Difference Equation Models for Spatio-Temporal Forecasting.” Spatial Statistics 37 (June): 100408. https://doi.org/10.1016/j.spasta.2020.100408.
Zang, Yaohua, Gang Bao, Xiaojing Ye, and Haomin Zhou. 2020. “Weak Adversarial Networks for High-Dimensional Partial Differential Equations.” Journal of Computational Physics 411 (June): 109409. https://doi.org/10.1016/j.jcp.2020.109409.
Zhang, Dongkun, Ling Guo, and George Em Karniadakis. 2020. “Learning in Modal Space: Solving Time-Dependent Stochastic PDEs Using Physics-Informed Neural Networks.” SIAM Journal on Scientific Computing 42 (2): A639–65. https://doi.org/10.1137/19M1260141.
Zhang, Dongkun, Lu Lu, Ling Guo, and George Em Karniadakis. 2019. “Quantifying Total Uncertainty in Physics-Informed Neural Networks for Solving Forward and Inverse Stochastic Problems.” Journal of Computational Physics 397 (November): 108850. https://doi.org/10.1016/j.jcp.2019.07.048.
Zhu, Yinhao, Nicholas Zabaras, Phaedon-Stelios Koutsourelakis, and Paris Perdikaris. 2019. “Physics-Constrained Deep Learning for High-Dimensional Surrogate Modeling and Uncertainty Quantification Without Labeled Data.” Journal of Computational Physics 394 (October): 56–81. https://doi.org/10.1016/j.jcp.2019.05.024.
