Machine learning for physical sciences

Turbulent mixing at the boundary between two disciplines with differing inertia and viscosity



Consider a spherical flame

In physics, typically, we are concerned with identifying True Parameters for Universal Laws, applicable without prejudice across all the cosmos. We are hunting something like the Platonic ideals that our experiments are poor shadows of. This is especially so in, say, quantum physics or cosmology.

In machine learning, typically, we want to make good generic predictions for a given process, and to quantify how good those predictions can be given how much data we have and the approximate kind of process we witness. There is no notion of universal truth waiting around the corner to back up our wild fancies. On the other hand, we are less concerned with the noisy sublunary chaos of experiments: we need not worry how far our noise drives us from universal truth as long as we make good predictions in the local problem at hand. But here, far from universality, we have only weak and vague notions of how to generalise our models to new circumstances and new noise. That is, in the Platonic ideal of machine learning, there are no Platonic ideals to be found.

(This explanation does no justice to either physics or machine learning, but it will do as framing rather than getting too deep into the history or philosophy of science.)

Can these areas have something to say to one another nevertheless? After an interesting conversation with Shane Keating about the difficulties of ocean dynamics, I am thinking about this in a new way. Generally, we might have notions from physics of what “truly” underlies a system, but many unknown parameters, noisy measurements, computational intractability and complex or chaotic dynamics interfere with our ability to predict it using known laws of physics alone. Here we want to come up with a “best possible” stochastic model of the system given our uncertainties and constraints, which looks more like an ML problem.

At a basic level, it’s not controversial (I don’t think?) to use machine learning methods to analyse data in experiments, even with trendy deep neural networks. I understand that this is significant, e.g. in connectomics.

Perhaps a little more fringe is using machine learning to reduce computational burden via surrogate models, e.g. Carleo and Troyer (2017).
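To give the flavour: a surrogate replaces an expensive simulator with a cheap learned approximation trained on a handful of simulator runs. Here is a minimal sketch of the idea (mine, not any of the cited authors’; `expensive_simulator` is a hypothetical stand-in for a real solver) using a Gaussian-process regressor:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical stand-in for an expensive physics simulation (e.g. a PDE solve).
def expensive_simulator(theta):
    return np.sin(3 * theta) + 0.5 * theta**2

# Run the expensive code at a handful of design points...
theta_train = np.linspace(-2, 2, 15)[:, None]
y_train = expensive_simulator(theta_train).ravel()

# ...then fit a cheap Gaussian-process surrogate to those runs.
surrogate = GaussianProcessRegressor(kernel=RBF(length_scale=0.5))
surrogate.fit(theta_train, y_train)

# Thereafter, predictions (with uncertainty) cost microseconds, not CPU-hours.
theta_test = np.linspace(-2, 2, 200)[:, None]
y_mean, y_std = surrogate.predict(theta_test, return_std=True)
```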

The thing that is especially interesting to me right now is learning the whole model in an ML formalism, using physical laws as input to the learning process.

To be concrete, Shane was specifically discussing problems in predicting and interpolating “tracers”, such as chemicals or heat, in oceanographic flows. Here we know a lot about the fluids concerned, but less about, say, the detail of the ocean floor, and our measurements are imperfect. Nonetheless, we know that certain invariants, conservation laws and so on hold, so a truly “nonparametric” approach to the dynamics would throw away information.

There is some cute work in learning approximations to physics, like the SINDy method, which sits somewhere at the intersection of compressive sensing, state filters and maybe even Koopman operators (Brunton, Proctor, and Kutz 2016); but it’s hard to imagine scaling this up (at least directly) to big things like large image sensor arrays and other such weakly structured input.
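For flavour, the core of SINDy is sequential thresholded least squares: regress estimated time derivatives onto a library of candidate terms, then alternately zero out small coefficients and refit. A toy sketch of that loop (my own, not the authors’ implementation; see e.g. the PySINDy package for a real one), recovering a damped linear oscillator:

```python
import numpy as np

# Simulate a damped linear oscillator to stand in for measured data.
A_true = np.array([[0.0, 1.0],
                   [-2.0, -0.1]])
dt, n = 0.01, 5000
X = np.zeros((n, 2))
X[0] = [1.0, 0.0]
for k in range(n - 1):                  # crude Euler integration
    X[k + 1] = X[k] + dt * (A_true @ X[k])
dX = np.gradient(X, dt, axis=0)         # numerically estimated derivatives

# Library of candidate terms: [1, x, y, x^2, xy, y^2].
x, y = X[:, 0], X[:, 1]
Theta = np.column_stack([np.ones(n), x, y, x**2, x * y, y**2])

# Sequential thresholded least squares: the SINDy core loop.
Xi = np.linalg.lstsq(Theta, dX, rcond=None)[0]
for _ in range(10):
    Xi[np.abs(Xi) < 0.05] = 0.0         # kill small coefficients...
    for j in range(2):                  # ...and refit the survivors
        big = np.abs(Xi[:, j]) >= 0.05
        Xi[big, j] = np.linalg.lstsq(Theta[:, big], dX[:, j], rcond=None)[0]

# Columns of Xi ≈ coefficients of dx/dt = y and dy/dt = -2x - 0.1y.
print(Xi)
```

The threshold (0.05 here) is the knob that trades sparsity against fit; real problems also need care in estimating derivatives from noisy data.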

Researchers like Chang et al. (2017) claim that learning “compositional object” models should be possible. The compositional models are learnable objects with learnable pairwise interactions, and bear a passing resemblance to the physical laws that physics experiments hope to discover, although I’m not yet totally persuaded by the details of this particular framework. On the other hand, unmotivated appeals to autoencoders as descriptions of the underlying dynamics of physical reality don’t seem sufficient either.
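The structural idea, at least, is easy to sketch: per-object state vectors, a learned pairwise-interaction function, and a learned per-object update applied to the summed incoming effects. Below is a skeleton of that composition (my own gloss, with untrained random-weight MLPs standing in for the learned functions):

```python
import numpy as np

def mlp(sizes, rng):
    """Tiny random-weight MLP, standing in for a learned function."""
    Ws = [rng.normal(0, 0.1, (m, k)) for m, k in zip(sizes[:-1], sizes[1:])]
    def f(z):
        for W in Ws[:-1]:
            z = np.tanh(z @ W)
        return z @ Ws[-1]
    return f

rng = np.random.default_rng(0)
d_state, d_effect = 4, 8
pairwise = mlp([2 * d_state, 32, d_effect], rng)      # interaction "law"
update = mlp([d_state + d_effect, 32, d_state], rng)  # per-object update

def step(states):
    """Each object is updated from the summed pairwise effects
    exerted on it by every other object."""
    new = np.zeros_like(states)
    for i in range(len(states)):
        effects = sum(pairwise(np.concatenate([states[i], states[j]]))
                      for j in range(len(states)) if j != i)
        new[i] = states[i] + update(np.concatenate([states[i], effects]))
    return new

states = rng.normal(size=(5, d_state))  # five interacting "objects"
states = step(states)                   # one step of learned "physics"
```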

There is an O’Reilly podcast and reflist about deep learning for science in particular. There was a special track for papers in this area at NeurIPS.

Related: “SciML”, which often seems to mean learning ODEs in particular, is important. See the various SciML conferences, e.g. at ICERM.

CNN classification of atmospheric rivers

Figure: sample images of atmospheric rivers correctly classified (true positive) by a deep CNN model, showing total column water vapor (color map) and the land/sea boundary (solid line). From Y. Liu et al. (2016).

Data-informed inference for physical systems

See Physics-based Deep Learning (Thuerey et al. 2021). Also see Brunton and Kutz’s Data-Driven Science and Engineering web material around their book (Brunton and Kutz 2019); the seminar series by the authors of that book is a moving feast of the latest results in this area. For neural differential equations in particular, Patrick Kidger’s thesis seems good (Kidger 2022).
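To make “physics-informed” concrete, here is a minimal sketch (my own, in JAX, not any of these authors’ code) of a physics-informed neural network in the spirit of Raissi, Perdikaris, and Karniadakis (2019): a small network is trained so that its automatic derivatives satisfy the toy boundary-value problem u''(x) = -π² sin(πx), u(0) = u(1) = 0, whose exact solution is sin(πx).

```python
import jax
import jax.numpy as jnp

def init(rng, sizes=(1, 32, 32, 1)):
    """Random (weight, bias) pairs for a small MLP."""
    keys = jax.random.split(rng, len(sizes) - 1)
    return [(jax.random.normal(k, (m, n)) / jnp.sqrt(m), jnp.zeros(n))
            for k, (m, n) in zip(keys, zip(sizes[:-1], sizes[1:]))]

def u(params, x):                       # scalar in, scalar out
    z = jnp.atleast_1d(x)
    for W, b in params[:-1]:
        z = jnp.tanh(z @ W + b)
    W, b = params[-1]
    return (z @ W + b)[0]

# Second derivative of the network via nested autodiff.
u_xx = jax.grad(jax.grad(u, argnums=1), argnums=1)

def loss(params, xs):
    # Residual of u'' = -pi^2 sin(pi x) at collocation points,
    # plus a penalty enforcing u(0) = u(1) = 0.
    res = jax.vmap(lambda x: u_xx(params, x) + jnp.pi**2 * jnp.sin(jnp.pi * x))(xs)
    bc = u(params, 0.0) ** 2 + u(params, 1.0) ** 2
    return jnp.mean(res**2) + bc

xs = jnp.linspace(0.0, 1.0, 64)
params = init(jax.random.PRNGKey(0))
grad_fn = jax.jit(jax.grad(loss))
for _ in range(5000):                   # plain gradient descent, for simplicity
    params = jax.tree_util.tree_map(lambda p, g: p - 1e-3 * g,
                                    params, grad_fn(params, xs))
# u(params, 0.5) should now approximate sin(pi/2) = 1.
```

Plain gradient descent on the summed residual and boundary losses is the crudest possible treatment; Adam, loss weighting and more collocation points are the usual refinements.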

ML for PDEs

See ML PDEs.

Causality, identifiability, and observational data

One ML-flavoured notion is the use of observational data to derive the models. If I am modelling an entire ocean, or even a river, doing experiments is presumably out of the question for reasons of cost and ethics, so the overall model must be calibrated against observational data; we have to wait until there is a flood to see what floods do. This is generally done badly in ML, but there are formalisms for it, notably graphical models for causal inference. Can we work out the confounders and do counterfactual inference? Is imposing an arrow of causation already doing some work for us?
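As a toy illustration of what the causal formalism buys us (synthetic numbers, invented for this example): a confounder Z drives both a “treatment” X and an outcome Y, so the naive observational contrast overstates the causal effect, while stratifying on Z, i.e. back-door adjustment, recovers it.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Confounder Z drives both the "treatment" X and the outcome Y;
# the true causal effect of X on Y is +1.
Z = rng.binomial(1, 0.5, n)
X = rng.binomial(1, 0.2 + 0.6 * Z)            # Z makes X more likely
Y = 1.0 * X + 2.0 * Z + rng.normal(0, 1, n)   # Z also raises Y directly

# Naive observational contrast: biased upwards by the confounder.
naive = Y[X == 1].mean() - Y[X == 0].mean()

# Back-door adjustment: average the contrast within strata of Z.
adjusted = sum((Y[(X == 1) & (Z == z)].mean() - Y[(X == 0) & (Z == z)].mean())
               * (Z == z).mean() for z in (0, 1))

print(f"naive: {naive:.2f}, adjusted: {adjusted:.2f}")  # ≈ 2.2 vs ≈ 1.0
```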

Small subsystems might be informed by experiments, of course.

Likelihood-free inference

Popular when we have a simulator that can sample from the system but no tractable likelihood. See likelihood-free inference.
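The simplest version is rejection ABC: draw parameters from the prior, simulate, and keep the draws whose simulated summaries land close to the observed ones. A minimal sketch, with a hypothetical Gaussian simulator standing in for something expensive:

```python
import numpy as np

rng = np.random.default_rng(0)

# Black-box simulator: we can sample from it, but write down no likelihood.
def simulate(theta, n=50):
    return rng.normal(theta, 1.0, n)

observed = simulate(3.0)                 # pretend this is field data
s_obs = observed.mean()                  # a summary statistic

# Rejection ABC: keep prior draws whose simulations resemble the data.
posterior = []
while len(posterior) < 1000:
    theta = rng.uniform(-10, 10)         # draw from a broad prior
    if abs(simulate(theta).mean() - s_obs) < 0.1:   # tolerance on the summary
        posterior.append(theta)

# The kept draws approximate the posterior; the mean should sit near 3.
print(np.mean(posterior), np.std(posterior))
```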

Emulation approaches

See Emulation and surrogates.

The other direction: What does physics say about learning?

See why does deep learning work or the statistical mechanics of statistics.

Related, maybe: the recovery phase transitions in compressed sensing.

But statistics is ML

Why not “statistics for physical sciences”? Isn’t ML just statistics? Why thanks, Dan, for asking that. Yes it is, as far as content goes. But the different disciplines licence different uses of the tools. Pragmatically, using the predictive modelling tools that ML practitioners advocate has been helpful for doing better statistics. When we talk about statistics for physical processes we tend to think of your grandpappy’s statistics: parametric methods where the parameters are the parameters of physical laws. The modern emphasis in machine learning is on nonparametric, overparameterised or approximate methods that do not necessarily correspond to the world in any interpretable way. Deep learning and so on. Sure, that is still statistics if you like, but I would have needed to spend more words explaining that, and it would have buried the lede.

Applications

Bushfires, hydrology, climate models, molecular dynamics…

References

Altmann, Robert, Patrick Henning, and Daniel Peterseim. 2021. Numerical Homogenization Beyond Scale Separation.” Acta Numerica 30 (May): 1–86.
Altosaar, Jaan, Rajesh Ranganath, and Kyle Cranmer. 2019. “Hierarchical Variational Models for Statistical Physics.” In, 5.
Asher, M. J., B. F. W. Croke, A. J. Jakeman, and L. J. M. Peeters. 2015. A Review of Surrogate Models and Their Application to Groundwater Modeling.” Water Resources Research 51 (8): 5957–73.
Atkinson, Steven, Waad Subber, and Liping Wang. 2019. “Data-Driven Discovery of Free-Form Governing Differential Equations.” In, 7.
Auzina, Ilze Amanda, Cagatay Yildiz, and Efstratios Gavves. 2022. Latent GP-ODEs with Informative Priors.” In.
Ayed, Ibrahim, and Emmanuel de Bézenac. 2019. “Learning Dynamical Systems from Partial Observations.” In Advances In Neural Information Processing Systems, 12.
Baker, Ruth E., Jose-Maria Peña, Jayaratnam Jayamohan, and Antoine Jérusalem. 2018. Mechanistic Models Versus Machine Learning, a Fight Worth Fighting for the Biological Community? Biology Letters 14 (5): 20170660.
Bar-Sinai, Yohai, Stephan Hoyer, Jason Hickey, and Michael P. Brenner. 2019. Learning Data-Driven Discretizations for Partial Differential Equations.” Proceedings of the National Academy of Sciences 116 (31): 15344–49.
Beck, Christian, Weinan E, and Arnulf Jentzen. 2019. Machine Learning Approximation Algorithms for High-Dimensional Fully Nonlinear Partial Differential Equations and Second-Order Backward Stochastic Differential Equations.” Journal of Nonlinear Science 29 (4): 1563–1619.
Bottero, Luca, Francesco Calisto, Giovanni Graziano, Valerio Pagliarino, Martina Scauda, Sara Tiengo, and Simone Azeglio. 2020. Physics-Informed Machine Learning Simulator for Wildfire Propagation,” December.
Brehmer, Johann, Kyle Cranmer, Siddharth Mishra-Sharma, Felix Kling, and Gilles Louppe. 2019. “Mining Gold: Improving Simulation-Based Inference with Latent Information.” In, 7.
Brunton, Steven L., and Jose Nathan Kutz. 2019. Data-Driven Science and Engineering: Machine Learning, Dynamical Systems, and Control. Cambridge: Cambridge University Press.
Brunton, Steven L., Joshua L. Proctor, and J. Nathan Kutz. 2016. Discovering Governing Equations from Data by Sparse Identification of Nonlinear Dynamical Systems.” Proceedings of the National Academy of Sciences 113 (15): 3932–37.
Carleo, Giuseppe, and Matthias Troyer. 2017. Solving the Quantum Many-Body Problem with Artificial Neural Networks.” Science 355 (6325): 602–6.
Chang, Michael B., Tomer Ullman, Antonio Torralba, and Joshua B. Tenenbaum. 2017. A Compositional Object-Based Approach to Learning Physical Dynamics.” In Proceedings of ICLR.
Cranmer, Miles D, Rui Xu, Peter Battaglia, and Shirley Ho. 2019. “Learning Symbolic Physics with Graph Networks.” In Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS), 6.
Cui, Tao, Luk Peeters, Dan Pagendam, Trevor Pickett, Huidong Jin, Russell S. Crosbie, Matthias Raiber, David W. Rassam, and Mat Gilfedder. 2018. Emulator-Enabled Approximate Bayesian Computation (ABC) and Uncertainty Analysis for Computationally Expensive Groundwater Models.” Journal of Hydrology 564 (September): 191–207.
Deiana, Allison McCarn, Nhan Tran, Joshua Agar, Michaela Blott, Giuseppe Di Guglielmo, Javier Duarte, Philip Harris, et al. 2021. Applications and Techniques for Fast Machine Learning in Science.” arXiv:2110.13041 [Physics], October.
Faroughi, Salah A., Nikhil Pawar, Celio Fernandes, Maziar Raissi, Subasish Das, Nima K. Kalantari, and Seyed Kourosh Mahjour. 2023. Physics-Guided, Physics-Informed, and Physics-Encoded Neural Networks in Scientific Computing.” arXiv.
Filippi, Jean-Baptiste, Vivien Mallet, and Bahaa Nader. 2014. Representation and Evaluation of Wildfire Propagation Simulations.” International Journal of Wildland Fire 23 (1): 46.
Gahungu, Paterne, Christopher W. Lanyon, Mauricio A. Álvarez, Engineer Bainomugisha, Michael Thomas Smith, and Richard David Wilkinson. 2022. Adjoint-Aided Inference of Gaussian Process Driven Differential Equations.” In.
Ghattas, Omar, and Karen Willcox. 2021. Learning Physics-Based Models from Data: Perspectives from Inverse Problems and Model Reduction.” Acta Numerica 30 (May): 445–554.
Girolami, Mark, Eky Febrianto, Ge Yin, and Fehmi Cirak. 2021. The Statistical Finite Element Method (statFEM) for Coherent Synthesis of Observation Data and Model Predictions.” Computer Methods in Applied Mechanics and Engineering 375 (March): 113533.
Gladish, Daniel W., Daniel E. Pagendam, Luk J. M. Peeters, Petra M. Kuhnert, and Jai Vaze. 2018. Emulation Engines: Choice and Quantification of Uncertainty for Complex Hydrological Models.” Journal of Agricultural, Biological and Environmental Statistics 23 (1): 39–62.
Goldstein, Evan B., and Giovanni Coco. 2015. Machine Learning Components in Deterministic Models: Hybrid Synergy in the Age of Data.” Frontiers in Environmental Science 3 (April).
Gulian, Mamikon, Ari Frankel, and Laura Swiler. 2020. Gaussian Process Regression Constrained by Boundary Value Problems.” arXiv:2012.11857 [Cs, Math, Stat], December.
He, QiZhi, David Barajas-Solano, Guzel Tartakovsky, and Alexandre M. Tartakovsky. 2020. Physics-Informed Neural Networks for Multiphysics Data Assimilation with Application to Subsurface Transport.” Advances in Water Resources 141 (July): 103610.
Hoffimann, Júlio, Maciel Zortea, Breno de Carvalho, and Bianca Zadrozny. 2021. Geostatistical Learning: Challenges and Opportunities.” Frontiers in Applied Mathematics and Statistics 7.
Holl, Philipp, Vladlen Koltun, and Nils Thuerey. 2022. Scale-Invariant Learning by Physics Inversion.” In.
Holl, Philipp, Nils Thuerey, and Vladlen Koltun. 2020. Learning to Control PDEs with Differentiable Physics.” In ICLR, 5.
Hu, Yuanming, Luke Anderson, Tzu-Mao Li, Qi Sun, Nathan Carr, Jonathan Ragan-Kelley, and Frédo Durand. 2020. DiffTaichi: Differentiable Programming for Physical Simulation.” In ICLR.
Hu, Yuanming, Tzu-Mao Li, Luke Anderson, Jonathan Ragan-Kelley, and Frédo Durand. 2019. Taichi: A Language for High-Performance Computation on Spatially Sparse Data Structures.” ACM Transactions on Graphics 38 (6): 1–16.
Innes, Mike, Alan Edelman, Keno Fischer, Chris Rackauckas, Elliot Saba, Viral B. Shah, and Will Tebbutt. 2019. A Differentiable Programming System to Bridge Machine Learning and Scientific Computing.” arXiv.
Jin, Hanxun, Enrui Zhang, and Horacio D. Espinosa. 2023. Recent Advances and Applications of Machine Learning in Experimental Solid Mechanics: A Review.” arXiv.
Karniadakis, George Em, Ioannis G. Kevrekidis, Lu Lu, Paris Perdikaris, Sifan Wang, and Liu Yang. 2021. Physics-Informed Machine Learning.” Nature Reviews Physics 3 (6): 422–40.
Kashinath, K., M. Mustafa, A. Albert, J-L. Wu, C. Jiang, S. Esmaeilzadeh, K. Azizzadenesheli, et al. 2021. Physics-Informed Machine Learning: Case Studies for Weather and Climate Modelling.” Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 379 (2194): 20200093.
Kasim, M. F., D. Watson-Parris, L. Deaconu, S. Oliver, P. Hatfield, D. H. Froula, G. Gregori, et al. 2020. Up to Two Billion Times Acceleration of Scientific Simulations with Deep Neural Architecture Search.” arXiv:2001.08055 [Physics, Stat], January.
Kasim, Muhammad, J Topp-Mugglestone, P Hatfield, D H Froula, G Gregori, M Jarvis, E Viezzer, and Sam Vinko. 2019. “A Million Times Speed up in Parameters Retrieval with Deep Learning.” In, 5.
Kidger, Patrick. 2022. On Neural Differential Equations.” Oxford.
Kimura, Nobuaki, Ikuo Yoshinaga, Kenji Sekijima, Issaku Azechi, and Daichi Baba. 2020. Convolutional Neural Network Coupled with a Transfer-Learning Approach for Time-Series Flood Predictions.” Water 12 (1): 96.
Krämer, Nicholas, Nathanael Bosch, Jonathan Schmidt, and Philipp Hennig. 2021. Probabilistic ODE Solutions in Millions of Dimensions.” arXiv.
Li, Yang, and Jinqiao Duan. 2021a. A Data-Driven Approach for Discovering Stochastic Dynamical Systems with Non-Gaussian Levy Noise.” Physica D: Nonlinear Phenomena 417 (March): 132830.
———. 2021b. Extracting Governing Laws from Sample Path Data of Non-Gaussian Stochastic Dynamical Systems.” arXiv:2107.10127 [Math, Stat], July.
Li, Yunzhu, Antonio Torralba, Animashree Anandkumar, Dieter Fox, and Animesh Garg. 2020. Causal Discovery in Physical Systems from Videos.” arXiv:2007.00631 [Cs, Stat], July.
Liu, Xiao, Kyongmin Yeo, and Siyuan Lu. 2020. Statistical Modeling for Spatio-Temporal Data From Stochastic Convection-Diffusion Processes.” Journal of the American Statistical Association 0 (0): 1–18.
Liu, Yunjie, Evan Racah, Prabhat, Joaquin Correa, Amir Khosrowshahi, David Lavers, Kenneth Kunkel, Michael Wehner, and William Collins. 2016. Application of Deep Convolutional Neural Networks for Detecting Extreme Weather in Climate Datasets.” arXiv:1605.01156 [Cs], May.
Long, Da, Zheng Wang, Aditi Krishnapriyan, Robert Kirby, Shandian Zhe, and Michael Mahoney. 2022. AutoIP: A United Framework to Integrate Physics into Gaussian Processes.” arXiv.
Lu, Dan, and Daniel Ricciuto. 2019. Efficient Surrogate Modeling Methods for Large-Scale Earth System Models Based on Machine-Learning Techniques.” Geoscientific Model Development 12 (5): 1791–1807.
Lu, Lu, Xuhui Meng, Zhiping Mao, and George Em Karniadakis. 2021. DeepXDE: A Deep Learning Library for Solving Differential Equations.” SIAM Review 63 (1): 208–28.
Lu, Peter Y., Joan Ariño, and Marin Soljačić. 2021. Discovering Sparse Interpretable Dynamics from Partial Observations.” arXiv:2107.10879 [Physics], July.
Malartic, Quentin, Alban Farchi, and Marc Bocquet. 2021. State, Global and Local Parameter Estimation Using Local Ensemble Kalman Filters: Applications to Online Machine Learning of Chaotic Dynamics.” arXiv:2107.11253 [Nlin, Physics:physics, Stat], July.
Medasani, Bharat, Anthony Gamst, Hong Ding, Wei Chen, Kristin A. Persson, Mark Asta, Andrew Canning, and Maciej Haranczyk. 2016. Predicting Defect Behavior in B2 Intermetallics by Merging Ab Initio Modeling and Machine Learning.” Npj Computational Materials 2 (1): 1.
Meng, Chuizheng, Sungyong Seo, Defu Cao, Sam Griesemer, and Yan Liu. 2022. When Physics Meets Machine Learning: A Survey of Physics-Informed Machine Learning.” arXiv:2203.16797 [Cs, Stat], March.
Merwe, Rudolph van der, Todd K. Leen, Zhengdong Lu, Sergey Frolov, and Antonio M. Baptista. 2007. Fast Neural Network Surrogates for Very High Dimensional Physics-Based Models in Computational Oceanography.” Neural Networks, Computational Intelligence in Earth and Environmental Sciences, 20 (4): 462–78.
Mo, Shaoxing, Dan Lu, Xiaoqing Shi, Guannan Zhang, Ming Ye, Jianfeng Wu, and Jichun Wu. 2017. A Taylor Expansion-Based Adaptive Design Strategy for Global Surrogate Modeling With Applications in Groundwater Modeling.” Water Resources Research 53 (12): 10802–23.
Nabian, Mohammad Amin, and Hadi Meidani. 2019. A Deep Learning Solution Approach for High-Dimensional Random Differential Equations.” Probabilistic Engineering Mechanics 57 (July): 14–25.
Nair, Suraj, Yuke Zhu, Silvio Savarese, and Li Fei-Fei. 2019. Causal Induction from Visual Observations for Goal Directed Tasks.” arXiv:1910.01751 [Cs, Stat], October.
Ng, Ignavier, Shengyu Zhu, Zhitang Chen, and Zhuangyan Fang. 2019. A Graph Autoencoder Approach to Causal Structure Learning.” In Advances In Neural Information Processing Systems.
Otness, Karl, Arvi Gjoka, Joan Bruna, Daniele Panozzo, Benjamin Peherstorfer, Teseo Schneider, and Denis Zorin. 2021. An Extensible Benchmark Suite for Learning to Simulate Physical Systems.” In.
Paleyes, Andrei, Mark Pullin, Maren Mahsereci, Neil Lawrence, and Javier Gonzalez. 2019. Emulation of Physical Processes with Emukit.” In Advances In Neural Information Processing Systems, 8.
Park, Ji Hwan, Shinjae Yoo, and Balu Nadiga. 2019. “Machine Learning Climate Variability.” In, 5.
Partee, Sam, Michael Ringenburg, Benjamin Robbins, and Andrew Shao. 2019. “Model Parameter Optimization: ML-Guided Trans-Resolution Tuning of Physical Models.” In. Zenodo.
Pathak, Jaideep, Brian Hunt, Michelle Girvan, Zhixin Lu, and Edward Ott. 2018. Model-Free Prediction of Large Spatiotemporally Chaotic Systems from Data: A Reservoir Computing Approach.” Physical Review Letters 120 (2): 024102.
Pathak, Jaideep, Zhixin Lu, Brian R. Hunt, Michelle Girvan, and Edward Ott. 2017. Using Machine Learning to Replicate Chaotic Attractors and Calculate Lyapunov Exponents from Data.” Chaos: An Interdisciplinary Journal of Nonlinear Science 27 (12): 121102.
Pestourie, Raphaël, Youssef Mroueh, Christopher Vincent Rackauckas, Payel Das, and Steven Glenn Johnson. 2021. Data-Efficient Training with Physics-Enhanced Deep Surrogates.” In.
Popov, Andrey Anatoliyevich. 2022. Combining Data-Driven and Theory-Guided Models in Ensemble Data Assimilation.” ETD. Virginia Tech.
Portwood, Gavin D, Peetak P Mitra, Mateus Dias Ribeiro, Tan Minh Nguyen, Balasubramanya T Nadiga, Juan A Saenz, Michael Chertkov, and Animesh Garg. 2019. “Turbulence Forecasting via Neural ODE.” In, 7.
Qian, Elizabeth, Boris Kramer, Benjamin Peherstorfer, and Karen Willcox. 2020. Lift & Learn: Physics-Informed Machine Learning for Large-Scale Nonlinear Dynamical Systems.” Physica D: Nonlinear Phenomena 406 (May): 132401.
Rackauckas, Chris, Alan Edelman, Keno Fischer, Mike Innes, Elliot Saba, Viral B Shah, and Will Tebbutt. 2020. Generalized Physics-Informed Learning Through Language-Wide Differentiable Programming.” MIT Web Domain, 6.
Rackauckas, Christopher. 2019. The Essential Tools of Scientific Machine Learning (Scientific ML).”
Raghu, Maithra, and Eric Schmidt. 2020. A Survey of Deep Learning for Scientific Discovery.” arXiv:2003.11755 [Cs, Stat], March.
Raissi, Maziar, P. Perdikaris, and George Em Karniadakis. 2019. Physics-Informed Neural Networks: A Deep Learning Framework for Solving Forward and Inverse Problems Involving Nonlinear Partial Differential Equations.” Journal of Computational Physics 378 (February): 686–707.
Raissi, Maziar, Alireza Yazdani, and George Em Karniadakis. 2020. Hidden Fluid Mechanics: Learning Velocity and Pressure Fields from Flow Visualizations.” Science 367 (6481): 1026–30.
Ramsundar, Bharath, Dilip Krishnamurthy, and Venkatasubramanian Viswanathan. 2021. Differentiable Physics: A Position Piece.” arXiv:2109.07573 [Physics], September.
Razavi, Saman. 2021. Deep Learning, Explained: Fundamentals, Explainability, and Bridgeability to Process-Based Modelling.” Environmental Modelling & Software 144 (October): 105159.
Razavi, Saman, Bryan A. Tolson, and Donald H. Burn. 2012. Review of Surrogate Modeling in Water Resources.” Water Resources Research 48 (7).
Rezende, Danilo J, Sébastien Racanière, Irina Higgins, and Peter Toth. 2019. “Equivariant Hamiltonian Flows.” In Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS), 6.
Saemundsson, Steindor, Alexander Terenin, Katja Hofmann, and Marc Peter Deisenroth. 2020. Variational Integrator Networks for Physically Structured Embeddings.” arXiv:1910.09349 [Cs, Stat], March.
Sanchez-Gonzalez, Alvaro, Victor Bapst, Peter Battaglia, and Kyle Cranmer. 2019. “Hamiltonian Graph Networks with ODE Integrators.” In Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS), 11.
Sanchez-Gonzalez, Alvaro, Jonathan Godwin, Tobias Pfaff, Rex Ying, Jure Leskovec, and Peter Battaglia. 2020. Learning to Simulate Complex Physics with Graph Networks.” In Proceedings of the 37th International Conference on Machine Learning, 8459–68. PMLR.
Sargsyan, Khachik, Bert Debusschere, Habib Najm, and Youssef Marzouk. 2009. Bayesian Inference of Spectral Expansions for Predictability Assessment in Stochastic Reaction Networks.” Journal of Computational and Theoretical Nanoscience 6 (10): 2283–97.
Sarkar, Soumalya, and Michael Joly. 2019. Multi-Fidelity Learning with Heterogeneous Domains.” In NeurIPS, 5.
Särkkä, Simo. 2011. Linear Operators and Stochastic Partial Differential Equations in Gaussian Process Regression.” In Artificial Neural Networks and Machine Learning – ICANN 2011, edited by Timo Honkela, Włodzisław Duch, Mark Girolami, and Samuel Kaski, 6792:151–58. Lecture Notes in Computer Science. Berlin, Heidelberg: Springer.
Siade, Adam J., Tao Cui, Robert N. Karelse, and Clive Hampton. 2020. Reduced‐Dimensional Gaussian Process Machine Learning for Groundwater Allocation Planning Using Swarm Theory.” Water Resources Research 56 (3).
Sun, Alexander Y., Hongkyu Yoon, Chung-Yan Shih, and Zhi Zhong. 2021. Applications of Physics-Informed Scientific Machine Learning in Subsurface Science: A Survey.” arXiv:2104.04764 [Physics], April.
Tait, Daniel J., and Theodoros Damoulas. 2020. Variational Autoencoding of PDE Inverse Problems.” arXiv:2006.15641 [Cs, Stat], June.
Tartakovsky, Alexandre M., Carlos Ortiz Marrero, Paris Perdikaris, Guzel D. Tartakovsky, and David Barajas-Solano. 2018. Learning Parameters and Constitutive Relationships with Physics Informed Deep Neural Networks,” August.
Thuerey, Nils, Philipp Holl, Maximilian Mueller, Patrick Schnell, Felix Trost, and Kiwon Um. 2021. Physics-Based Deep Learning. WWW.
Tompson, Jonathan, Kristofer Schlachter, Pablo Sprechmann, and Ken Perlin. 2017. Accelerating Eulerian Fluid Simulation with Convolutional Networks.” In Proceedings of the 34th International Conference on Machine Learning - Volume 70, 3424–33. ICML’17. Sydney, NSW, Australia: JMLR.org.
Wang, Sifan, Shyam Sankaran, and Paris Perdikaris. 2022. Respecting Causality Is All You Need for Training Physics-Informed Neural Networks.” arXiv.
Willard, Jared, Xiaowei Jia, Shaoming Xu, Michael Steinbach, and Vipin Kumar. n.d. “Integrating Scientific Knowledge with Machine Learning for Engineering and Environmental Systems” 1 (1): 35.
Witteveen, Jeroen A. S., and Hester Bijl. 2006. Modeling Arbitrary Uncertainties Using Gram-Schmidt Polynomial Chaos.” In 44th AIAA Aerospace Sciences Meeting and Exhibit. American Institute of Aeronautics and Astronautics.
Wu, Tailin, Takashi Maruyama, and Jure Leskovec. 2022. Learning to Accelerate Partial Differential Equations via Latent Global Evolution.” arXiv.
Yang, Liu, Dongkun Zhang, and George Em Karniadakis. 2020. Physics-Informed Generative Adversarial Networks for Stochastic Differential Equations.” SIAM Journal on Scientific Computing 42 (1): A292–317.
Yu, Xiayang, Tao Cui, J. Sreekanth, Stephane Mangeon, Rebecca Doble, Pei Xin, David Rassam, and Mat Gilfedder. 2020. Deep Learning Emulators for Groundwater Contaminant Transport Modelling.” Journal of Hydrology, August, 125351.
Zammit-Mangion, Andrew, and Christopher K. Wikle. 2020. Deep Integro-Difference Equation Models for Spatio-Temporal Forecasting.” Spatial Statistics 37 (June): 100408.
Zang, Yaohua, Gang Bao, Xiaojing Ye, and Haomin Zhou. 2020. Weak Adversarial Networks for High-Dimensional Partial Differential Equations.” Journal of Computational Physics 411 (June): 109409.
Zhang, Dongkun, Ling Guo, and George Em Karniadakis. 2020. Learning in Modal Space: Solving Time-Dependent Stochastic PDEs Using Physics-Informed Neural Networks.” SIAM Journal on Scientific Computing 42 (2): A639–65.
Zhang, Dongkun, Lu Lu, Ling Guo, and George Em Karniadakis. 2019. Quantifying Total Uncertainty in Physics-Informed Neural Networks for Solving Forward and Inverse Stochastic Problems.” Journal of Computational Physics 397 (November): 108850.
Zhu, Yinhao, Nicholas Zabaras, Phaedon-Stelios Koutsourelakis, and Paris Perdikaris. 2019. Physics-Constrained Deep Learning for High-Dimensional Surrogate Modeling and Uncertainty Quantification Without Labeled Data.” Journal of Computational Physics 394 (October): 56–81.
