Uncertainty quantification

December 26, 2016 — July 6, 2021

Bayes
statistics
stochastic processes
surrogate
uncertainty

Using machine learning to make predictions, with a measure of the confidence of those predictions.


1 Taxonomy

Should clarify. TBD. A recent reference on the theme: Kendall and Gal (2017). This disentangles aleatoric uncertainty (irreducible noise in the data-generating process) from epistemic uncertainty (uncertainty about the model itself, which more data can shrink). Also to mention: model uncertainty.
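As a rough quantitative version of that distinction, the predictive variance of a Bayesian model splits by the law of total variance into the two parts. This is a standard decomposition rather than a quotation of Kendall and Gal's notation; here $\theta$ denotes the model parameters, $\mathcal{D}$ the training data and $x$ a test input:

$$
\underbrace{\operatorname{Var}(y \mid x, \mathcal{D})}_{\text{predictive}}
= \underbrace{\mathbb{E}_{\theta \mid \mathcal{D}}\!\left[\operatorname{Var}(y \mid x, \theta)\right]}_{\text{aleatoric}}
+ \underbrace{\operatorname{Var}_{\theta \mid \mathcal{D}}\!\left(\mathbb{E}[y \mid x, \theta]\right)}_{\text{epistemic}}
$$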

2 DUQ networks

DUQ (Deterministic Uncertainty Quantification) estimates predictive uncertainty from a single deterministic network: van Amersfoort et al. (2020); see also Kendall and Gal (2017).

3 Bayes

Bayesian methods have some notion of uncertainty baked in. You can get some way with, e.g., Gaussian process regression or probabilistic NNs, as in the sketch below.
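A minimal sketch of the GP option, using scikit-learn; the kernel and toy data here are illustrative placeholders, not a recommendation:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(40)

# The WhiteKernel term estimates observation noise (the aleatoric part);
# the posterior over functions supplies the epistemic part.
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(X, y)

X_test = np.linspace(-4, 4, 200).reshape(-1, 1)
mean, std = gp.predict(X_test, return_std=True)
lower, upper = mean - 1.96 * std, mean + 1.96 * std  # ~95% pointwise band
```

The point is that the predictive standard deviation comes out of the same fit as the prediction itself, rather than being bolted on afterwards.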

4 Physical model calibration

PEST, PEST++, and pyemu are integrated systems for uncertainty quantification that use some idiosyncratic terminology, such as FOSM (first-order, second-moment) models. I think these are best considered as inverse problem solvers, with the uncertainty quantification arriving as a side effect of the inversion.
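Stripped of the terminology, FOSM is just linear-Gaussian error propagation: linearize the model about the calibrated parameters and push the prior and noise covariances through the Jacobian. A minimal numpy sketch of the resulting posterior covariance (the variable names are mine, and a real implementation such as pyemu would factor rather than invert):

```python
import numpy as np

def fosm_posterior_cov(J, C_prior, C_noise):
    """First-order, second-moment (linear-Gaussian) posterior parameter
    covariance.

    J       : (n_obs, n_par) Jacobian of model outputs w.r.t. parameters,
              evaluated at the calibrated parameter values.
    C_prior : (n_par, n_par) prior parameter covariance.
    C_noise : (n_obs, n_obs) observation-noise covariance.
    """
    precision = np.linalg.inv(C_prior) + J.T @ np.linalg.inv(C_noise) @ J
    return np.linalg.inv(precision)
```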

5 Conformal prediction

See conformal prediction.

6 Chaos expansions

See chaos expansions.

7 Uncertainty Quantification 360

IBM’s Uncertainty Quantification 360 toolkit provides a summary of popular generic methods:

  • Auxiliary Interval Predictor

Use an auxiliary model to improve the calibration of UQ generated by the original model.

  • Blackbox Metamodel Classification

Extract confidence scores from trained black-box classification models using a meta-model.

  • Blackbox Metamodel Regression

Extract prediction intervals from trained black-box regression models using a meta-model.

  • Classification Calibration

Post-hoc calibration of classification models using Isotonic Regression and Platt Scaling (sketched in plain scikit-learn after this list).

  • Heteroscedastic Regression

Train regression models that capture data uncertainty, assuming the targets are noisy and the amount of noise varies between data points.

  • Homoscedastic Gaussian Process Regression

Train Gaussian Process Regression models with homoscedastic noise that capture data and model uncertainty.

  • Horseshoe BNN classification

Train Bayesian neural network classifiers with Gaussian and Horseshoe priors that capture data and model uncertainty.

  • Horseshoe BNN regression

Train BNN regression models with Gaussian and Horseshoe priors that capture data and model uncertainty.

  • Infinitesimal Jackknife

Extract uncertainty from trained models by approximating the effect of training data perturbations on the model’s predictions.

  • Quantile Regression

Train Quantile Regression models that capture data uncertainty by learning two separate models, one each for the upper and lower quantile, to obtain prediction intervals (also sketched after this list).

  • UCC Recalibration

Recalibrate the UQ of a regression model to a specified operating point using the Uncertainty Characteristics Curve.
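For flavour, the classification-calibration entry corresponds to something you can already do in plain scikit-learn; this sketch uses an arbitrary base classifier and cv setting, not UQ360's own API:

```python
from sklearn.calibration import CalibratedClassifierCV
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# method="sigmoid" is Platt scaling; method="isotonic" is isotonic regression.
calibrated = CalibratedClassifierCV(
    RandomForestClassifier(random_state=0), method="sigmoid", cv=5
)
calibrated.fit(X_train, y_train)
probs = calibrated.predict_proba(X_test)[:, 1]  # hopefully better calibrated
```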
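Likewise the quantile-regression entry, again sketched with generic scikit-learn pieces rather than the toolkit's implementation:

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor

X, y = make_regression(n_samples=2000, noise=10.0, random_state=0)

# One model per quantile: pinball loss at alpha=0.05 / 0.95 gives the
# lower / upper bounds of a nominal 90% prediction interval.
lower_model = GradientBoostingRegressor(loss="quantile", alpha=0.05).fit(X, y)
upper_model = GradientBoostingRegressor(loss="quantile", alpha=0.95).fit(X, y)

lower, upper = lower_model.predict(X), upper_model.predict(X)
coverage = ((y >= lower) & (y <= upper)).mean()  # crude in-sample check
```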

They provide guidance on method selection in the manual:

Figure 2: Guidance on method selection. Source: UQ360.

8 References

Alvarsson, Arvidsson McShane, Norinder, et al. 2021. “Predicting With Confidence: Using Conformal Prediction in Drug Discovery.” Journal of Pharmaceutical Sciences.
Bhatt, Antorán, Zhang, et al. 2021. “Uncertainty as a Form of Transparency: Measuring, Communicating, and Using Uncertainty.” arXiv:2011.07586 [Cs].
Bishop. 1994. “Mixture Density Networks.” Microsoft Research.
Burrows, and Doherty. 2015. “Efficient Calibration/Uncertainty Analysis Using Paired Complex/Surrogate Models.” Groundwater.
Chen, and Oliver. 2013. “Levenberg–Marquardt Forms of the Iterative Ensemble Smoother for Efficient History Matching and Uncertainty Quantification.” Computational Geosciences.
Chipman, George, and McCulloch. 2006. “Bayesian Ensemble Learning.” In.
Daxberger, Kristiadi, Immer, et al. 2021. “Laplace Redux — Effortless Bayesian Deep Learning.” arXiv:2106.14806 [Cs, Stat].
Doherty. 2015. Calibration and Uncertainty Analysis for Complex Environmental Models.
Gal, and Ghahramani. 2015. “Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning.” In Proceedings of the 33rd International Conference on Machine Learning (ICML-16).
———. 2016. “Dropout as a Bayesian Approximation: Appendix.” arXiv:1506.02157 [Stat].
Ghosh, Liao, Ramamurthy, et al. 2021. “Uncertainty Quantification 360: A Holistic Toolkit for Quantifying and Communicating the Uncertainty of AI.” arXiv:2106.01410 [Cs].
Gladish, Pagendam, Peeters, et al. 2018. “Emulation Engines: Choice and Quantification of Uncertainty for Complex Hydrological Models.” Journal of Agricultural, Biological and Environmental Statistics.
Gratiet, Marelli, and Sudret. 2016. “Metamodel-Based Sensitivity Analysis: Polynomial Chaos Expansions and Gaussian Processes.” In Handbook of Uncertainty Quantification.
Guth, Mojahed, and Sapsis. 2023. “Evaluation of Machine Learning Architectures on the Quantification of Epistemic and Aleatoric Uncertainties in Complex Dynamical Systems.” SSRN Scholarly Paper.
Higdon, Gattiker, Williams, et al. 2008. “Computer Model Calibration Using High-Dimensional Output.” Journal of the American Statistical Association.
Hooten, Leeds, Fiechter, et al. 2011. “Assessing First-Order Emulator Inference for Physical Parameters in Nonlinear Mechanistic Models.” Journal of Agricultural, Biological, and Environmental Statistics.
Jarvenpaa, Vehtari, and Marttinen. 2020. “Batch Simulations and Uncertainty Quantification in Gaussian Process Surrogate Approximate Bayesian Computation.” In Conference on Uncertainty in Artificial Intelligence.
Kasim, Watson-Parris, Deaconu, et al. 2020. “Up to Two Billion Times Acceleration of Scientific Simulations with Deep Neural Architecture Search.” arXiv:2001.08055 [Physics, Stat].
Kendall, and Gal. 2017. “What Uncertainties Do We Need in Bayesian Deep Learning for Computer Vision?”
Kingma, Salimans, and Welling. 2015. “Variational Dropout and the Local Reparameterization Trick.” In Proceedings of the 28th International Conference on Neural Information Processing Systems - Volume 2. NIPS’15.
Kristiadi, Hein, and Hennig. 2021. “Learnable Uncertainty Under Laplace Approximations.” In Uncertainty in Artificial Intelligence.
Lakshminarayanan, Pritzel, and Blundell. 2017. “Simple and Scalable Predictive Uncertainty Estimation Using Deep Ensembles.” In Proceedings of the 31st International Conference on Neural Information Processing Systems. NIPS’17.
Minka. 2001. “Expectation Propagation for Approximate Bayesian Inference.” In Proceedings of the Seventeenth Conference on Uncertainty in Artificial Intelligence. UAI’01.
Mukhoti, Kirsch, van Amersfoort, et al. 2021. “Deterministic Neural Networks with Inductive Biases Capture Epistemic and Aleatoric Uncertainty.”
O’Hagan. 2013. “Polynomial Chaos: A Tutorial and Critique from a Statistician’s Perspective.”
Pestourie, Mroueh, Nguyen, et al. 2020. “Active Learning of Deep Surrogates for PDEs: Application to Metasurface Design.” npj Computational Materials.
Sacks, Schiller, and Welch. 1989. “Designs for Computer Experiments.” Technometrics.
Sacks, Welch, Mitchell, et al. 1989. “Design and Analysis of Computer Experiments.” Statistical Science.
Shafer, and Vovk. 2008. “A Tutorial on Conformal Prediction.” Journal of Machine Learning Research.
Siade, Putti, and Yeh. 2010. “Snapshot Selection for Groundwater Model Reduction Using Proper Orthogonal Decomposition.” Water Resources Research.
Smith. 2000. “Disentangling Uncertainty and Error: On the Predictability of Nonlinear Systems.” In Nonlinear Dynamics and Statistics.
Stuart. 2010. “Inverse Problems: A Bayesian Perspective.” Acta Numerica.
Tenorio. 2017. An Introduction to Data Analysis and Uncertainty Quantification for Inverse Problems. Mathematics in Industry.
Tibshirani, Foygel Barber, Candes, et al. 2019. “Conformal Prediction Under Covariate Shift.” In Advances in Neural Information Processing Systems.
Tonkin, and Doherty. 2009. “Calibration-Constrained Monte Carlo Analysis of Highly Parameterized Models Using Subspace Techniques.” Water Resources Research.
van Amersfoort, Smith, Teh, et al. 2020. “Uncertainty Estimation Using a Single Deep Deterministic Neural Network.” In International Conference on Machine Learning.
Ventola, Braun, Yu, et al. 2023. “Probabilistic Circuits That Know What They Don’t Know.” arXiv.org.
Vovk, Gammerman, and Shafer. 2005. Algorithmic Learning in a Random World.
Welter, White, Hunt, et al. 2015. “Approaches in Highly Parameterized Inversion—PEST++ Version 3, a Parameter ESTimation and Uncertainty Analysis Software Suite Optimized for Large Environmental Models.” USGS Numbered Series 7-C12. Techniques and Methods.
Wen, Tran, and Ba. 2020. “BatchEnsemble: An Alternative Approach to Efficient Ensemble and Lifelong Learning.” In ICLR.
White. 2018. “A Model-Independent Iterative Ensemble Smoother for Efficient History-Matching and Uncertainty Quantification in Very High Dimensions.” Environmental Modelling & Software.
White, Fienen, and Doherty. 2016a. “pyEMU: A Python Framework for Environmental Model Uncertainty Analysis Version .01.”
———. 2016b. “A Python Framework for Environmental Model Uncertainty Analysis.” Environmental Modelling & Software.
White, Hunt, Fienen, et al. 2020. “Approaches to Highly Parameterized Inversion: PEST++ Version 5, a Software Suite for Parameter Estimation, Uncertainty Analysis, Management Optimization and Sensitivity Analysis.” USGS Numbered Series 7-C26. Techniques and Methods.
Zeni, Fontana, and Vantini. 2020. “Conformal Prediction: A Unified Review of Theory and New Challenges.” arXiv:2005.07972 [Cs, Econ, Stat].
Zhang, Lu, Guo, et al. 2019. “Quantifying Total Uncertainty in Physics-Informed Neural Networks for Solving Forward and Inverse Stochastic Problems.” Journal of Computational Physics.