Using machine learning to make predictions, with a measure of the confidence of those predictions.
Bayes
Bayesian methods, e.g. Gaussian processes, have this baked in: the posterior predictive distribution gives you a confidence measure for free.
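A minimal sketch of that "baked in" uncertainty, using scikit-learn's Gaussian process regressor (toy data and kernel choices are my own illustrative assumptions):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X_train = rng.uniform(-3, 3, size=(30, 1))
y_train = np.sin(X_train).ravel() + rng.normal(0, 0.1, size=30)

gp = GaussianProcessRegressor(
    kernel=RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01),
    normalize_y=True,
)
gp.fit(X_train, y_train)

# Predictive mean and standard deviation come out of the same call;
# the std widens away from the training data, which is the built-in
# confidence measure.
X_test = np.array([[0.0], [10.0]])  # one point inside the data, one far outside
mean, std = gp.predict(X_test, return_std=True)
```

Note the extrapolated point gets a much larger predictive standard deviation than the interpolated one; that widening is the whole point.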
Physical model setting
PEST, PEST++, and pyemu are integrated systems for uncertainty quantification of calibrated physical models. They use some idiosyncratic terminology, such as FOSM (first-order, second-moment) analysis, which at first glance resembles LIME-style model interpretation. The difficulty here is that these models have complicated physical dynamics which are hard to handle directly, but a surrogate model may be more tractable. There are some tricks to approximate model uncertainty using surrogates that I am only now learning about.
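One such trick, sketched very roughly: emulate the expensive simulator with a cheap surrogate, then propagate parameter uncertainty through the surrogate by Monte Carlo. Everything below is an illustrative assumption (the "simulator" is a toy `sin`, the parameter distribution is invented), not the pyemu/PEST workflow itself:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_simulator(theta):
    # Stand-in for a physical model; pretend each call takes hours.
    return np.sin(theta)

rng = np.random.default_rng(1)

# 1. Run the simulator at a handful of design points (all we can afford).
design = np.linspace(-3, 3, 15).reshape(-1, 1)
outputs = expensive_simulator(design).ravel()

# 2. Fit a cheap surrogate to those few runs.
surrogate = GaussianProcessRegressor(kernel=RBF(length_scale=1.0))
surrogate.fit(design, outputs)

# 3. Propagate parameter uncertainty through the surrogate by Monte Carlo,
#    which would be unaffordable on the real simulator.
theta_samples = rng.normal(0.0, 1.0, size=(5000, 1))
pred = surrogate.predict(theta_samples)
print(pred.mean(), pred.std())  # approximate output mean and spread
```

The surrogate's own approximation error is an extra uncertainty term that the fancier methods account for; this sketch ignores it.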
Conformal prediction
Predicting with competence: the best machine learning idea you never heard of:
The essential idea is that a “conformity function” exists. Effectively you construct a sort of multivariate cumulative distribution function for your machine learning gizmo using the conformity function. Such CDFs exist for classical methods like ARIMA and linear regression under the right circumstances; CP brings the idea to machine learning in general, and to models like ARIMA when the standard parametric confidence intervals won’t work. Within the framework, the conformity function, whatever it may be, when used correctly is guaranteed to give confidence intervals valid to within a probabilistic tolerance.
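A minimal sketch of one common flavour, split conformal regression (the underlying model, data, and miscoverage level are my own illustrative assumptions; the conformity function here is just the absolute residual):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
n = 2000
X = rng.uniform(-2, 2, size=(n, 1))
y = 3.0 * X.ravel() + rng.normal(0, 1.0, size=n)

# Split: one chunk to fit the model, one to calibrate, one to evaluate.
X_fit, y_fit = X[:800], y[:800]
X_cal, y_cal = X[800:1500], y[800:1500]
X_new, y_new = X[1500:], y[1500:]

model = LinearRegression().fit(X_fit, y_fit)

# Conformity scores on the held-out calibration set.
scores = np.abs(y_cal - model.predict(X_cal))

# Quantile with a finite-sample correction; for exchangeable data this
# yields >= (1 - alpha) marginal coverage regardless of the model.
alpha = 0.1
k = int(np.ceil((len(scores) + 1) * (1 - alpha)))
q = np.sort(scores)[k - 1]

# Prediction intervals on new points.
pred = model.predict(X_new)
lo, hi = pred - q, pred + q
coverage = np.mean((y_new >= lo) & (y_new <= hi))
```

The guarantee is marginal (on average over data draws), and the model itself can be anything; a bad model just yields wider intervals, not invalid ones.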
Hmm