Emulation, a.k.a. surrogate modelling. In this context, it means reducing complicated physics-driven simulations to simpler or faster ones using ML techniques. It is especially popular in the ML-for-physics pipeline. I have mostly done this in the context of surrogate optimisation for experiments. See Neil Lawrence on Emulation for a modern overview.
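A minimal sketch of the idea, with a hypothetical one-dimensional "simulator" standing in for the expensive physics code: run it at a few design points, fit a cheap regression surrogate to the input/output pairs, then query the surrogate densely at negligible cost.

```python
import numpy as np

# Stand-in for an expensive physics simulation (hypothetical; here just a
# smooth 1-D response so the example runs instantly).
def expensive_simulator(x):
    return np.sin(3 * x) + 0.5 * x**2

# Run the costly simulator at a handful of design points...
x_train = np.linspace(-2, 2, 20)
y_train = expensive_simulator(x_train)

# ...and fit a cheap polynomial surrogate to the input/output pairs.
coeffs = np.polyfit(x_train, y_train, deg=10)
surrogate = np.poly1d(coeffs)

# The surrogate can now be evaluated densely for negligible cost.
x_dense = np.linspace(-2, 2, 1000)
max_err = np.max(np.abs(surrogate(x_dense) - expensive_simulator(x_dense)))
print(f"max surrogate error on [-2, 2]: {max_err:.4f}")
```

In practice the surrogate is usually a Gaussian process or a neural network rather than a polynomial, and the design points are chosen adaptively, but the division of labour is the same.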

A recent, hyped paper that exemplifies this approach is Kasim et al. (2020), which (somewhat implicitly) uses arguments from Dropout ensembling to produce quasi-Bayesian emulations of notoriously slow simulations.
Does it actually work?
And if it *does* quantify posterior predictive uncertainty well, can it estimate other posterior uncertainties?
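The Dropout-ensembling idea can be sketched in a few lines: keep dropout active at prediction time and treat the spread of repeated stochastic forward passes as a predictive uncertainty. The tiny untrained network below is purely illustrative (my own stand-in, not the architecture of Kasim et al.):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny MLP with fixed (untrained) weights, standing in for a
# network trained to emulate a slow simulation.
W1 = rng.normal(size=(1, 64))
W2 = rng.normal(size=(64, 1)) / np.sqrt(64)

def forward(x, p_drop=0.2):
    """One stochastic forward pass, with dropout kept on at test time."""
    h = np.maximum(x @ W1, 0.0)          # ReLU hidden layer
    mask = rng.random(h.shape) > p_drop  # Bernoulli dropout mask
    h = h * mask / (1.0 - p_drop)        # inverted-dropout scaling
    return h @ W2

# Monte Carlo dropout: repeat the stochastic pass and summarise the spread.
x = np.array([[0.7]])
samples = np.stack([forward(x) for _ in range(200)])
mean, std = samples.mean(axis=0), samples.std(axis=0)
print(f"predictive mean {mean.item():.3f} ± {std.item():.3f}")
```

Whether that spread is a *calibrated* posterior predictive uncertainty is exactly the question above.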

Emukit (Paleyes et al. 2019) is a toolkit which generically wraps ML models for emulation purposes.

ML-accelerated PDE solvers might be a particularly useful application here.

## Model order reduction

The traditional, and still useful, approach is [reduced order modelling](./model_order_reduction.html), which comes with many related tricks.
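As a taste of that approach, here is a minimal proper-orthogonal-decomposition sketch on synthetic snapshot data (my own toy example): a truncated SVD of a snapshot matrix yields a low-dimensional basis in which full-order states can be compressed and reconstructed.

```python
import numpy as np

rng = np.random.default_rng(1)

# Snapshot matrix: each column is one full-order state of a (hypothetical)
# simulation; here, noisy combinations of two smooth spatial modes.
x = np.linspace(0, 1, 200)
modes = np.stack([np.sin(np.pi * x), np.sin(2 * np.pi * x)], axis=1)
coeffs = rng.normal(size=(2, 50))
snapshots = modes @ coeffs + 1e-3 * rng.normal(size=(200, 50))

# Proper orthogonal decomposition: a truncated SVD gives the reduced basis.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
r = 2                 # number of retained modes
basis = U[:, :r]      # (200, r) reduced-order basis

# Project a full state down to r coordinates and reconstruct it.
state = snapshots[:, 0]
reduced = basis.T @ state        # r numbers instead of 200
reconstructed = basis @ reduced
err = np.linalg.norm(state - reconstructed) / np.linalg.norm(state)
print(f"relative reconstruction error with {r} modes: {err:.2e}")
```

A full reduced-order model would also project the governing equations into this basis; the sketch only shows the compression step.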

## References

*Acta Numerica* 30 (May): 1–86.

*Water Resources Research* 51 (8): 5957–73.

*Journal of Hydrology* 564 (September): 191–207.

*Progress in Aerospace Sciences* 45 (1–3): 50–79.

*Acta Numerica* 30 (May): 445–554.

*Journal of Agricultural, Biological and Environmental Statistics* 23 (1): 39–62.

*Frontiers in Environmental Science* 3 (April).

*Journal of the American Statistical Association* 103 (482): 570–83.

*Frontiers in Applied Mathematics and Statistics* 7.

*Journal of Agricultural, Biological, and Environmental Statistics* 16 (4): 475–94.

*Conference on Uncertainty in Artificial Intelligence*, 779–88. PMLR.

*arXiv:2001.08055 [Physics, Stat]*, January.

*arXiv:1801.07337 [Physics]*, March.

*Computational Geosciences* 23 (5): 1193–1215.

*Geoscientific Model Development* 12 (5): 1791–1807.

*Neural Networks*, Computational Intelligence in Earth and Environmental Sciences, 20 (4): 462–78.

*Water Resources Research* 53 (12): 10802–23.

*Journal of the Royal Statistical Society: Series B (Methodological)* 40 (1): 1–24.

*Reliability Engineering & System Safety*, The Fourth International Conference on Sensitivity Analysis of Model Output (SAMO 2004), 91 (10): 1290–300.

*Technometrics* 59 (1): 80–92.

*Advances in Neural Information Processing Systems*, 8.

*Journal of the American Statistical Association* 112 (519): 1274–85.

*Progress in Aerospace Sciences* 41 (1): 1–28.

*Water Resources Research* 48 (7).

*Advances in Intelligent Data Analysis XVIII*, edited by Michael R. Berthold, Ad Feelders, and Georg Krempl, 12080:548–60. Cham: Springer International Publishing.

*Technometrics* 31 (1): 41–47.

*Statistical Science* 4 (4): 409–23.

*Third Workshop on Machine Learning and the Physical Sciences (NeurIPS 2020)*.

*Water Resources Research* 56 (3).

*arXiv:2006.15641 [Cs, Stat]*, June.

*Hydrology and Earth System Sciences* 24 (9): 4641–58.

*Nature Communications* 11 (1): 5622.

*Proceedings of the 34th International Conference on Machine Learning - Volume 70*, 3424–33. ICML’17. Sydney, NSW, Australia: JMLR.org.

*Statistical Science* 29 (1): 81–90.

*Statistical Science* 31 (4): 465–89.

*Environmental Modelling & Software* 85 (November): 217–28.

*Journal of Hydrology*, August, 125351.

*Journal of Computational Physics* 394 (October): 56–81.
