Neural learning for geoscience

September 16, 2020 — November 18, 2024

dynamical systems
machine learning
neural nets
sciml
SDEs
signal processing
spatial
stochastic processes
time series
spheres

Neural spatiotemporal prediction where the object of interest is the planet, or at least a large part of it. The discipline is characterised by the importance of physics, by the need to work at massive scale, and by nearly-spherical domains such as Earth.

1 Foundation models

See geospatial foundation models.

2 Spherical data

Classical spatial methods often pretend that data arrives in rectangular chunks, which is convenient, but works poorly on spheres (Bonev et al. 2023; Mahesh et al. 2024).
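One small illustration of the problem: on an equiangular lat-lon grid, cells shrink towards the poles, so even a global mean goes wrong if you treat the grid as a flat rectangle. A minimal numpy sketch with a toy synthetic field (not real data):

```python
import numpy as np

# Toy field on an equiangular lat-lon grid: an "anomaly" of 1.0
# everywhere poleward of 60 degrees, 0.0 elsewhere.
nlat, nlon = 91, 180
lat = np.linspace(-90, 90, nlat)  # degrees, equiangular spacing
field = np.where(np.abs(lat)[:, None] > 60, 1.0, 0.0) * np.ones((nlat, nlon))

# Naive rectangular mean: treats every grid cell as equal area.
naive_mean = field.mean()

# Spherical mean: cell area scales with cos(latitude), so polar
# cells must count for much less.
w = np.cos(np.deg2rad(lat))
weighted_mean = (field * w[:, None]).sum() / (w.sum() * nlon)

print(naive_mean)     # ~0.33: a third of the grid *rows* are polar
print(weighted_mean)  # much smaller: the polar caps are a small area
```

Area weighting fixes averages, but convolutions, Fourier transforms and attention all suffer analogous distortions on lat-lon grids, which is what spherical harmonic approaches such as spherical Fourier neural operators address.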

3 Tooling

Don’t write your own spatial data loaders. From personal experience, I can tell you that they are difficult to get right, for a bunch of technical reasons. The problem is best prosecuted by a transdisciplinary team. If you don’t have any geoscientists, it might not go too badly if you at least use existing libraries such as TorchGeo (Stewart et al. 2022) (HT Dan Steinberg for mentioning these).
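To give a flavour of why hand-rolled loaders go wrong, here is a deliberately minimal patch sampler (my own toy sketch, not any library’s API). Even this trivial version must decide what happens at the ragged right and bottom edges; real loaders additionally juggle coordinate reference systems, nodata masks, multi-resolution bands and chunked I/O:

```python
import numpy as np

def sample_patches(raster, patch, stride):
    """Yield (row, col, window) patches over a 2D raster array.

    A toy sketch of one small piece of a spatial data loader.
    Windows that would overhang the right or bottom edge are
    silently skipped -- a policy choice, and a common source of
    subtle coverage bugs in hand-rolled loaders.
    """
    h, w = raster.shape
    for r in range(0, h - patch + 1, stride):
        for c in range(0, w - patch + 1, stride):
            yield r, c, raster[r:r + patch, c:c + patch]

raster = np.arange(100, dtype=float).reshape(10, 10)
patches = list(sample_patches(raster, patch=4, stride=4))
print(len(patches))  # 4 windows; rows and columns 8-9 fall off the
                     # ragged edge and are never sampled at all
```

Libraries like TorchGeo make these policies explicit (samplers, CRS-aware indexing) instead of leaving them as accidents of a loop bound.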

4 References

Ayed, and de Bézenac. 2019. “Learning Dynamical Systems from Partial Observations.” In Advances In Neural Information Processing Systems.
Beck, E, and Jentzen. 2019. “Machine Learning Approximation Algorithms for High-Dimensional Fully Nonlinear Partial Differential Equations and Second-Order Backward Stochastic Differential Equations.” Journal of Nonlinear Science.
Bodnar, Bruinsma, Lucic, et al. 2024. “Aurora: A Foundation Model of the Atmosphere.”
Bonev, Kurth, Hundt, et al. 2023. “Spherical Fourier Neural Operators: Learning Stable Dynamics on the Sphere.” In Proceedings of the 40th International Conference on Machine Learning. ICML’23.
Chen, Chong, Dou, Chen, et al. 2022. “A Novel Neural Network Training Framework with Data Assimilation.” The Journal of Supercomputing.
Chen, Yuming, Sanz-Alonso, and Willett. 2022. “Autodifferentiable Ensemble Kalman Filters.” SIAM Journal on Mathematics of Data Science.
———. 2023. “Reduced-Order Autodifferentiable Ensemble Kalman Filters.” Inverse Problems.
Grohs, and Herrmann. 2022. “Deep Neural Network Approximation for High-Dimensional Elliptic PDEs with Boundary Conditions.” IMA Journal of Numerical Analysis.
Guth, Mojahed, and Sapsis. 2023. “Evaluation of Machine Learning Architectures on the Quantification of Epistemic and Aleatoric Uncertainties in Complex Dynamical Systems.” SSRN Scholarly Paper.
Lam, Sanchez-Gonzalez, Willson, et al. 2023. “GraphCast: Learning Skillful Medium-Range Global Weather Forecasting.”
Mahesh, Collins, Bonev, et al. 2024. “Huge Ensembles Part I: Design of Ensemble Weather Forecasts Using Spherical Fourier Neural Operators.”
Park, Yoo, and Nadiga. 2019. “Machine Learning Climate Variability.”
Pathak, Subramanian, Harrington, et al. 2022. “FourCastNet: A Global Data-Driven High-Resolution Weather Model Using Adaptive Fourier Neural Operators.”
Pirinen, Mogren, and Västerdal. 2023. “Fully Convolutional Networks for Dense Water Flow Intensity Prediction in Swedish Catchment Areas.”
Raissi, Yazdani, and Karniadakis. 2020. “Hidden Fluid Mechanics: Learning Velocity and Pressure Fields from Flow Visualizations.” Science.
Safonova, Ghazaryan, Stiller, et al. 2023. “Ten Deep Learning Techniques to Address Small Data Problems with Remote Sensing.” International Journal of Applied Earth Observation and Geoinformation.
Särkkä, and Hartikainen. 2012. “Infinite-Dimensional Kalman Filtering Approach to Spatio-Temporal Gaussian Process Regression.” In Artificial Intelligence and Statistics.
Stewart, Robinson, Corley, et al. 2022. “TorchGeo: Deep Learning with Geospatial Data.” In Proceedings of the 30th International Conference on Advances in Geographic Information Systems. SIGSPATIAL ’22.
Yang, Zhang, and Karniadakis. 2020. “Physics-Informed Generative Adversarial Networks for Stochastic Differential Equations.” SIAM Journal on Scientific Computing.
Zammit-Mangion, Ng, Vu, et al. 2021. “Deep Compositional Spatial Models.” Journal of the American Statistical Association.