# Uncertainty quantification

December 26, 2016 — July 6, 2021

*Tags: Bayes, statistics, stochastic processes, surrogate, uncertainty*

Using machine learning to make predictions, with a measure of the confidence of those predictions.

## 1 Taxonomy

Should clarify; TBD. A recent reference on the theme is Kendall and Gal (2017), which disentangles aleatoric uncertainty (irreducible noise in the data) from epistemic uncertainty (reducible uncertainty about the model, given more data). Model uncertainty is also worth mentioning.
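A toy sketch of the distinction, not taken from any of the references above: epistemic uncertainty can be read off the disagreement among an ensemble of models trained on bootstrap resamples (roughly the deep-ensembles idea of Lakshminarayanan et al. 2017), while the residual scatter around the ensemble mean is a crude stand-in for aleatoric noise. All names here are illustrative.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 1))
y = np.sin(4 * X).ravel() + rng.normal(scale=0.2, size=200)

# Train an ensemble on bootstrap resamples of the data.
preds = []
for seed in range(20):
    idx = rng.integers(0, len(X), len(X))
    model = GradientBoostingRegressor(random_state=seed).fit(X[idx], y[idx])
    preds.append(model.predict(X))
preds = np.array(preds)  # shape (n_models, n_points)

mean_pred = preds.mean(axis=0)
epistemic_std = preds.std(axis=0)          # disagreement among models
aleatoric_std = np.std(y - mean_pred)      # crude residual-noise estimate
```

Where the ensemble members agree, epistemic uncertainty is low even if the data themselves are noisy.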

## 2 DUQ networks

van Amersfoort et al. (2020); Kendall and Gal (2017)

## 3 Bayes

Bayesian methods have a notion of uncertainty baked in. You can get a fair way with, e.g., Gaussian process regression or probabilistic NNs.
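For instance, a Gaussian process regressor gives a predictive mean and standard deviation at every test point for free. A minimal sketch with scikit-learn (the kernel choice here is illustrative):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=40)

# RBF kernel for the latent function, WhiteKernel for observation noise.
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(X, y)

X_test = np.linspace(-3, 3, 100).reshape(-1, 1)
mean, std = gp.predict(X_test, return_std=True)  # predictive mean and std
```

The predictive standard deviation widens away from the training data, which is exactly the epistemic behaviour we want from a UQ method.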

## 4 Physical model calibration

PEST, PEST++, and pyemu are integrated systems for uncertainty quantification that use some idiosyncratic terminology, such as FOSM (first-order second-moment) models. I think these are best considered inverse-problem solvers, with uncertainty quantification as a side effect of the inversion.

## 5 Chaos expansions

See chaos expansions.

## 6 Uncertainty Quantification 360

IBM’s Uncertainty Quantification 360 toolkit provides a summary of popular generic methods:

• Auxiliary Interval Predictor

Use an auxiliary model to improve the calibration of UQ generated by the original model.

• Blackbox Metamodel Classification

Extract confidence scores from trained black-box classification models using a meta-model.

• Blackbox Metamodel Regression

Extract prediction intervals from trained black-box regression models using a meta-model.

• Classification Calibration

Post-hoc calibration of classification models using Isotonic Regression and Platt Scaling.

• Heteroscedastic Regression

Train regression models that capture data uncertainty, assuming the targets are noisy and the amount of noise varies between data points.

• Homoscedastic Gaussian Process Regression

Train Gaussian Process Regression models with homoscedastic noise that capture data and model uncertainty.

• Horseshoe BNN classification

Train Bayesian neural networks classifiers with Gaussian and Horseshoe priors that capture data and model uncertainty.

• Horseshoe BNN regression

Train BNN regression models with Gaussian and Horseshoe priors that capture data and model uncertainty.

• Infinitesimal Jackknife

Extract uncertainty from trained models by approximating the effect of training data perturbations on the model’s predictions.

• Quantile Regression

Train Quantile Regression models that capture data uncertainty, by learning two separate models for the upper and lower quantile to obtain the prediction intervals.

• UCC Recalibration

Recalibrate the UQ of a regression model to a specified operating point using the Uncertainty Characteristics Curve.
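One of the simplest entries in that list, quantile regression, can be sketched without the toolkit itself: fit two models, one per quantile, and read the prediction interval off their gap. This version uses scikit-learn's gradient boosting with a quantile loss; the data and quantile levels are illustrative.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 2, size=(300, 1))
# Heteroscedastic noise: spread grows with x.
y = X.ravel() + rng.normal(scale=0.1 + 0.3 * X.ravel())

# Two separate models, one per quantile, give a nominal 90% interval.
lo = GradientBoostingRegressor(loss="quantile", alpha=0.05).fit(X, y)
hi = GradientBoostingRegressor(loss="quantile", alpha=0.95).fit(X, y)

X_test = np.linspace(0, 2, 50).reshape(-1, 1)
lower, upper = lo.predict(X_test), hi.predict(X_test)

# Empirical (in-sample) coverage of the interval.
coverage = np.mean((y >= lo.predict(X)) & (y <= hi.predict(X)))
```

Because the quantiles are learned independently, the interval can adapt to heteroscedastic noise, though the two models can occasionally cross.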

They provide guidance on method selection in the manual.
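Post-hoc classification calibration, another method in the list above, is also available directly in scikit-learn; a minimal sketch (the choice of naive Bayes as the overconfident base classifier is illustrative):

```python
import numpy as np
from sklearn.calibration import CalibratedClassifierCV
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=1000, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Naive Bayes is typically overconfident; isotonic regression recalibrates
# its predicted probabilities on held-out folds.
raw = GaussianNB().fit(X_tr, y_tr)
cal = CalibratedClassifierCV(GaussianNB(), method="isotonic", cv=5).fit(X_tr, y_tr)

p_raw = raw.predict_proba(X_te)[:, 1]
p_cal = cal.predict_proba(X_te)[:, 1]
```

Swapping `method="isotonic"` for `method="sigmoid"` gives Platt scaling, the other calibrator mentioned above.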

## 7 References

Alvarsson, Arvidsson McShane, Norinder, et al. 2021. Journal of Pharmaceutical Sciences.
Bhatt, Antorán, Zhang, et al. 2021. arXiv:2011.07586 [Cs].
Bishop. 1994. Microsoft Research.
Burrows, and Doherty. 2015. Groundwater.
Chen, and Oliver. 2013. Computational Geosciences.
Chipman, George, and Mcculloch. 2006. “Bayesian Ensemble Learning.” In.
Daxberger, Kristiadi, Immer, et al. 2021. In arXiv:2106.14806 [Cs, Stat].
Doherty. 2015. Calibration and uncertainty analysis for complex environmental models.
Gal, and Ghahramani. 2015. In Proceedings of the 33rd International Conference on Machine Learning (ICML-16).
———. 2016. arXiv:1506.02157 [Stat].
Ghosh, Liao, Ramamurthy, et al. 2021. arXiv:2106.01410 [Cs].
Gladish, Pagendam, Peeters, et al. 2018. Journal of Agricultural, Biological and Environmental Statistics.
Gratiet, Marelli, and Sudret. 2016. In Handbook of Uncertainty Quantification.
Guth, Mojahed, and Sapsis. 2023. SSRN Scholarly Paper.
Higdon, Gattiker, Williams, et al. 2008. Journal of the American Statistical Association.
Hooten, Leeds, Fiechter, et al. 2011. Journal of Agricultural, Biological, and Environmental Statistics.
Jarvenpaa, Vehtari, and Marttinen. 2020. In Conference on Uncertainty in Artificial Intelligence.
Kasim, Watson-Parris, Deaconu, et al. 2020. arXiv:2001.08055 [Physics, Stat].
Kendall, and Gal. 2017.
Kingma, Salimans, and Welling. 2015. In Proceedings of the 28th International Conference on Neural Information Processing Systems - Volume 2. NIPS’15.
Kristiadi, Hein, and Hennig. 2021. In Uncertainty in Artificial Intelligence.
Lakshminarayanan, Pritzel, and Blundell. 2017. In Proceedings of the 31st International Conference on Neural Information Processing Systems. NIPS’17.
Minka. 2001. In Proceedings of the Seventeenth Conference on Uncertainty in Artificial Intelligence. UAI’01.
Mukhoti, Kirsch, van Amersfoort, et al. 2021.
O’Hagan. 2013. “Polynomial Chaos: A Tutorial and Critique from a Statistician’s Perspective.”
Pestourie, Mroueh, Nguyen, et al. 2020. Npj Computational Materials.
Sacks, Schiller, and Welch. 1989. Technometrics.
Sacks, Welch, Mitchell, et al. 1989. Statistical Science.
Shafer, and Vovk. 2008. Journal of Machine Learning Research.
Siade, Putti, and Yeh. 2010. Water Resources Research.
Smith. 2000. “Disentangling Uncertainty and Error: On the Predictability of Nonlinear Systems.” In Nonlinear Dynamics and Statistics.
Stuart. 2010. Acta Numerica.
Tenorio. 2017. An Introduction to Data Analysis and Uncertainty Quantification for Inverse Problems. Mathematics in Industry.
Tibshirani, Foygel Barber, Candes, et al. 2019. In Advances in Neural Information Processing Systems.
Tonkin, and Doherty. 2009. Water Resources Research.
van Amersfoort, Smith, Teh, et al. 2020. In International Conference on Machine Learning.
Ventola, Braun, Yu, et al. 2023. arXiv.org.
Vovk, Gammerman, and Shafer. 2005. Algorithmic Learning in a Random World.
Welter, White, Hunt, et al. 2015. USGS Numbered Series 7-C12. Techniques and Methods.
Wen, Tran, and Ba. 2020. In ICLR.
White. 2018. Environmental Modelling & Software.
White, Fienen, and Doherty. 2016a.
———. 2016b. Environmental Modelling & Software.
White, Hunt, Fienen, et al. 2020. USGS Numbered Series 7-C26. Techniques and Methods.
Zeni, Fontana, and Vantini. 2020. arXiv:2005.07972 [Cs, Econ, Stat].
Zhang, Lu, Guo, et al. 2019. Journal of Computational Physics.