Probabilistic numerics

July 13, 2023 — September 9, 2024

Tags: Bayes, calculus, compsci, dynamical systems, geometry, Hilbert space, how do science, machine learning, physics, regression, sciml, SDEs, signal processing, statistics, statmech, stochastic processes, surrogate, time series, uncertainty
Probabilistic Numerics claims:

Probabilistic numerics (PN) aims to quantify uncertainty arising from intractable or incomplete numerical computation and from stochastic input. This new paradigm treats a numerical problem as one of statistical inference instead. The probabilistic viewpoint provides a principled way to encode structural knowledge about a problem. By giving an explicit role to uncertainty from all sources, in particular from the computation itself, PN gives rise to new applications beyond the scope of classical methods.

Typical numerical tasks to which PN may be applied include optimization, integration, the solution of ordinary and partial differential equations, and the basic tasks of linear algebra, e.g. solution of linear systems and eigenvalue problems.

As well as offering an enriched reinterpretation of classical methods, the PN approach has several concrete practical points of value. The probabilistic interpretation of computation

  • allows building customized methods for specific problems with bespoke priors
  • formalizes the design of adaptive methods using tools from decision theory
  • provides a way of setting parameters of numerical methods via the Bayesian formalism
  • expedites the solution of mutually related problems of similar type
  • naturally incorporates sources of stochasticity in the computation
  • can give structured uncertainty, in the form of a probability measure rather than a mere error estimate

and finally it offers a principled approach to including numerical error in the propagation of uncertainty through chains of computations.

Keyword matching alone makes this terribly confusing, since many numerical methods are already “probabilistic” in the weaker sense of working with noisy observations or stochastic estimates. The sense intended here is different: we treat the numerical method itself, noise included, as a statistical inference problem, and quantify the uncertainty in its output. One very popular and publishable trick is therefore to make an already-probabilistic algorithm probabilistic in this second sense. We can even treat Monte Carlo (i.e. probabilistic) integration as inference, by placing a prior distribution over the function to be integrated and updating that prior with the samples we draw (Rasmussen and Ghahramani 2002).

1 Integration

Bayesian quadrature treats an integral as an inference problem: place a (typically Gaussian process) prior on the integrand, condition on a few function evaluations, and read off a posterior distribution over the value of the integral (O’Hagan 1991; Rasmussen and Ghahramani 2002; Huszár and Duvenaud 2012; Bach 2015).
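As a concrete illustration, here is a minimal NumPy sketch, under assumptions chosen for closed forms rather than generality: a zero-mean GP prior with RBF kernel on the integrand, and a standard normal integration measure π, so the kernel means ∫ k(x, xᵢ) dπ(x) are Gaussian integrals we can write down. When the evaluation points are themselves random draws from π, this is essentially the Bayesian Monte Carlo of Rasmussen and Ghahramani (2002). Function names and the toy integrand are mine, not from any of the cited papers.

```python
import numpy as np

def bayes_quadrature(x, y, ell=1.0, jitter=1e-9):
    """Posterior over Z = ∫ f(x) dπ(x), π = N(0, 1), under a GP prior
    f ~ GP(0, k) with RBF kernel k(x, x') = exp(-(x - x')² / (2 ℓ²))."""
    K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / ell**2)
    K += jitter * np.eye(len(x))  # numerical stabilization of the Gram matrix
    # Kernel mean z_i = ∫ k(x, x_i) dπ(x): closed form for RBF vs Gaussian π
    z = np.sqrt(ell**2 / (ell**2 + 1.0)) * np.exp(-0.5 * x**2 / (ell**2 + 1.0))
    # Prior variance of Z: ∫∫ k(x, x') dπ(x) dπ(x')
    zz = np.sqrt(ell**2 / (ell**2 + 2.0))
    w = np.linalg.solve(K, z)  # Bayesian quadrature weights
    return w @ y, zz - z @ w   # posterior mean and posterior variance of Z

rng = np.random.default_rng(0)
x = rng.normal(size=20)  # random design: "Bayesian Monte Carlo"
mean, var = bayes_quadrature(x, np.sin(x) ** 2, ell=0.7)
print(mean, var)  # E[sin²(X)] = (1 − e⁻²)/2 ≈ 0.432, plus model uncertainty
```

The point to notice is that the output is still a quadrature rule, a weighted sum of evaluations, but the weights come from conditioning a GP, and the variance quantifies how wrong the discretization itself could be.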

2 Optimization

(Wilkinson, Särkkä, and Solin 2023; Akyildiz, Elvira, and Miguez 2018; Hennig and Kiefel 2013; Mahsereci and Hennig 2015; Močkus 1975). This is probably the big success story of probabilistic numerics, and is put into practice in, e.g., Bayesian optimization and adaptive design of experiments.
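For flavour, a minimal Bayesian-optimization loop in the same spirit, not the method of any one paper above: a GP surrogate over a toy objective, with the next evaluation chosen by the expected-improvement acquisition. The kernel, lengthscale, and objective are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

def rbf(a, b, ell=0.3):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

def gp_posterior(X, y, Xs, noise=1e-6):
    """GP posterior mean and std at test points Xs, under a zero prior mean."""
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    mu = Ks.T @ np.linalg.solve(K, y)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks), axis=0)  # k(x, x) = 1
    return mu, np.sqrt(np.clip(var, 1e-12, None))

f = lambda x: np.sin(3 * x) + x**2 - 0.7 * x  # toy objective, to be minimized
grid = np.linspace(-2, 2, 401)
X = np.array([-1.5, 0.0, 1.5])
y = f(X)
for _ in range(15):
    mu, s = gp_posterior(X, y, grid)
    u = (y.min() - mu) / s
    ei = (y.min() - mu) * norm.cdf(u) + s * norm.pdf(u)  # expected improvement
    x_new = grid[np.argmax(ei)]
    X, y = np.append(X, x_new), np.append(y, f(x_new))
print(X[np.argmin(y)], y.min())  # best point and value found
```

Probabilistic line searches (Mahsereci and Hennig 2015) play much the same game one dimension at a time, inside a stochastic optimizer.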

3 Incoming

“Herding”: apparently it is secretly Bayesian quadrature (Welling 2009; Huszár and Duvenaud 2012; Chai et al. 2019)?

Sounds a little like an ensemble Kalman method?
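To make the quadrature connection concrete, here is a kernel-herding sketch in the style of Welling (2009): greedily pick points whose empirical kernel mean embedding tracks that of a target π = N(0, 1), reusing the closed-form RBF kernel mean from the quadrature sketch above. Huszár and Duvenaud (2012) show that herding with optimized rather than uniform weights coincides with Bayesian quadrature. The grid, lengthscale, and sample count here are illustrative.

```python
import numpy as np

def kernel_mean(x, ell=1.0):
    # μ_π(x) = ∫ k(x, x') dπ(x') for the RBF kernel and π = N(0, 1)
    return np.sqrt(ell**2 / (ell**2 + 1.0)) * np.exp(-0.5 * x**2 / (ell**2 + 1.0))

def herd(n=30, ell=1.0):
    grid = np.linspace(-4.0, 4.0, 2001)  # candidate points
    samples = []
    for t in range(n):
        # Greedy step: favour mass under π, penalize crowding near chosen points
        score = kernel_mean(grid, ell)
        for s in samples:
            score -= np.exp(-0.5 * (grid - s) ** 2 / ell**2) / (t + 1)
        samples.append(grid[np.argmax(score)])
    return np.array(samples)

pts = herd()
print(pts.mean(), (pts**2).mean())  # should approach 0 and 1, the N(0, 1) moments
```

The resulting uniformly-weighted point set is itself a quadrature rule for integrals against π, which is the sense in which herding is secretly quadrature.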

4 References

Akyildiz, Elvira, and Miguez. 2018. “The Incremental Proximal Method: A Probabilistic Perspective.”
Bach. 2015. “On the Equivalence Between Kernel Quadrature Rules and Random Feature Expansions.”
Chai, Ton, Garnett, et al. 2019. “Automated Model Selection with Bayesian Quadrature.”
de Roos. 2022. “Probabilistic Linear Algebra for Stochastic Optimization.”
Hennig, Ipsen, Mahsereci, et al. 2022. “Probabilistic Numerical Methods – From Theory to Implementation (Dagstuhl Seminar 21432).” Edited by Philipp Hennig, Ilse C.F. Ipsen, Maren Mahsereci, and Tim Sullivan. Dagstuhl Reports.
Hennig, and Kiefel. 2013. “Quasi-Newton Methods: A New Direction.” J. Mach. Learn. Res.
Hennig, Osborne, and Girolami. 2015. “Probabilistic Numerics and Uncertainty in Computations.” Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences.
Hennig, Osborne, and Kersting. 2022. Probabilistic Numerics: Computation as Machine Learning.
Huszár, and Duvenaud. 2012. “Optimally-Weighted Herding Is Bayesian Quadrature.”
Mahsereci, and Hennig. 2015. “Probabilistic Line Searches for Stochastic Optimization.” In Advances in Neural Information Processing Systems.
Močkus. 1975. “On Bayesian Methods for Seeking the Extremum.” In Optimization Techniques IFIP Technical Conference: Novosibirsk, July 1–7, 1974. Lecture Notes in Computer Science.
O’Hagan. 1991. “Bayes–Hermite Quadrature.” Journal of Statistical Planning and Inference.
Rasmussen, and Ghahramani. 2002. “Bayesian Monte Carlo.” In Proceedings of the 15th International Conference on Neural Information Processing Systems. NIPS’02.
Song, Zhang, Smola, et al. 2008. “Tailoring Density Estimation via Reproducing Kernel Moment Matching.” In Proceedings of the 25th International Conference on Machine Learning. ICML ’08.
Welling. 2009. “Herding Dynamical Weights to Learn.” In Proceedings of the 26th Annual International Conference on Machine Learning. ICML ’09.
Wenger. 2023. “Probabilistic Numerical Linear Algebra for Machine Learning.”
Wilkinson, Särkkä, and Solin. 2023. “Bayes–Newton Methods for Approximate Bayesian Inference with PSD Guarantees.” Journal of Machine Learning Research.
Wills, and Schön. 2017. “On the Construction of Probabilistic Newton-Type Algorithms.” In 2017 IEEE 56th Annual Conference on Decision and Control (CDC).
———. 2021. “Stochastic Quasi-Newton with Line-Search Regularisation.” Automatica.