Bayesian inverse problems in function space

a.k.a. Bayesian calibration, model uncertainty for PDEs and other wibbly, blobby things

October 13, 2020 — September 26, 2022

functional analysis
linear algebra
probability
sparser than thou
spatial
statistics

Inverse problems where the unknown parameter lives in some function space. For me this usually implies a spatiotemporal model, often in the context of PDE solvers, particularly approximate ones.

Suppose I have a PDE, possibly with some unknown parameters in the driving equation. I can do an adequate job of predicting the future behaviour of that system if I somehow know the governing equations, their parameters, and the current state. But what if I am missing some information? What if I wish to simultaneously infer some unknown inputs? Let us say, the starting state? This is the kind of problem that we refer to as an inverse problem. Inverse problems arise naturally in tomography, compressed sensing, deconvolution, inverting PDEs and many other areas.

The thing that is special about PDEs is that they have spatial structure, far richer than that of the low-dimensional inference problems statisticians traditionally studied, so it is worth reasoning through them from first principles.

In particular, I would like to work through enough notation here that I can understand the various methods used to solve these inverse problems, for example, simulation-based inference, MCMC methods, GANs or variational inference.

Generally, I am interested in problems that use some kind of probabilistic network, so that we can not merely guess the solution but also quantify our uncertainty about it.

1 Discretisation

The first step is imagining how we can handle this complex problem in a finite computer. Lassas, Saksman, and Siltanen (2009) introduce a nice notation for this, which I use here. It connects the problem of inference to sampling theory, via the realisation that we need to discretize the solution in order to compute it.

I also wish to ransack their literature review:

The study of Bayesian inversion in infinite-dimensional function spaces was initiated by Franklin (1970) and continued by Mandelbaum (1984); Lehtinen, Paivarinta, and Somersalo (1989); Fitzpatrick (1991); and Luschgy (1996). The concept of discretization invariance was formulated by Markku Lehtinen in the 1990s and has been studied by D’Ambrogi, Mäenpää, and Markkanen (1999); Sari Lasanen (2002); S. Lasanen and Roininen (2005); Piiroinen (2005). A definition of discretization invariance similar to the above was given in Lassas and Siltanen (2004). For other kinds of discretization of continuum objects in the Bayesian framework, see Battle, Cunningham, and Hanson (1997); Niinimäki, Siltanen, and Kolehmainen (2007)… For regularization-based approaches for statistical inverse problems, see Bissantz, Hohage, and Munk (2004); Engl, Hofinger, and Kindermann (2005); Engl and Nashed (1981); Pikkarainen (2006). The relationship between continuous and discrete (non-statistical) inversion is studied in Hilbert spaces in Vogel (1984). See Borcea, Druskin, and Knizhnerman (2005) for specialized discretizations for inverse problems.

The insight is that two discretizations are relevant: the discretization of the measurements and the discretization of the representation of the solution. We need one discretization, \(P_{k}\), to handle the finiteness of our measurements, and another, \(T_{n}\), to characterise the finite dimensionality of our solution.

Consider a quantity \(U\) observed via some indirect, noisy mapping \[ M=A U+\mathcal{E}, \] where \(A\) is an operator and \(\mathcal{E}\) is some mean-zero noise. We call this the continuum model. Here \(U\) and \(M\) are functions defined on subsets of \(\mathbb{R}^{d}\). We start by assuming \(A\) is a linear smoothing operator — think of convolution with some kernel. We intend to use Bayesian inversion to deduce information about \(U\) from measurement data concerning \(M\). We write these in random-function notation: \(U(x, \omega), M(y, \omega)\) and \(\mathcal{E}(y, \omega)\) are random functions with \(\omega \in \Omega\) pulled from some probability space \((\Omega, \Sigma, \mathbb{P})\). \(x\) and \(y\) denote the function arguments, i.e. they range over the Euclidean domains. These objects are all continuous; we explore the implications of discretising them.
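To make this concrete, here is a minimal numerical sketch of the continuum model on a fine grid, assuming, purely for illustration, that \(A\) is convolution with a Gaussian kernel on a one-dimensional domain; the grid size, kernel width, and noise level are all hypothetical choices, not part of the model.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

rng = np.random.default_rng(0)

# A fine grid standing in for the continuum: x ranges over [0, 1].
x = np.linspace(0.0, 1.0, 512)

# A hypothetical "true" unknown U(x): a smooth bump plus a jump.
u_true = np.exp(-((x - 0.3) ** 2) / 0.01) + (x > 0.7).astype(float)

# A: a linear smoothing operator -- here, convolution with a Gaussian kernel.
def A(u, sigma=10.0):
    return gaussian_filter1d(u, sigma, mode="nearest")

# The continuum model M = A U + E, with mean-zero Gaussian noise E.
noise_level = 0.02
m = A(u_true) + noise_level * rng.standard_normal(x.shape)
```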

Next, we introduce the practical measurement model, which is the first kind of discretisation. We assume that the measurement device provides us with a \(k\)-dimensional realization \[ M_{k}=P_{k} M=A_{k} U+\mathcal{E}_{k}, \] where \(A_{k}=P_{k} A\) and \(\mathcal{E}_{k}=P_{k} \mathcal{E}\). \(P_{k}\) is a linear operator describing the measurement process; typically it will look something like \(P_{k} v=\sum_{j=1}^{k}\left\langle v, \phi_{j}\right\rangle \phi_{j}\) for some orthonormal basis \(\{\phi_{j}\}_j\). For simplicity, we take \(P_{k}\) to be exactly this projection onto the span of the first \(k\) basis elements. Realized measurements are written \(m=M\left(\omega_{0}\right)\), for some \(\omega_{0} \in \Omega\); projected measurement vectors are similarly written \(m_{k}=M_{k}\left(\omega_{0}\right)\).
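Continuing the sketch above, \(P_k\) might look like the following; the cosine basis is an arbitrary stand-in for \(\{\phi_j\}_j\), chosen only because it is easy to write down.

```python
# P_k: projection onto the first k elements of an (approximately)
# orthonormal basis. The cosine basis here is an illustrative stand-in.
def basis(x, size):
    rows = [np.ones_like(x)] + [
        np.sqrt(2.0) * np.cos(j * np.pi * x) for j in range(1, size)
    ]
    # Scale so the rows are approximately orthonormal in R^len(x).
    return np.stack(rows) / np.sqrt(len(x))

k = 32
Phi_k = basis(x, k)   # shape (k, len(x))
m_k = Phi_k @ m       # the realized k-dimensional data, m_k = P_k M(omega_0)
```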

In this notation, the inverse problem is: given a realization \(M_{k}\left(\omega_{0}\right)\), estimate the distribution of \(U\).

We cannot represent that distribution yet because \(U\) is a continuum object. So we introduce another discretization, via another projection operator \(T_n\), which maps \(U\) to an \(n\)-dimensional space, \(U_n:=T_n U\). This gives us the computational model, \[ M_{k n}=A_{k} U_{n}+\mathcal{E}_{k}. \]
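In the same sketch, \(T_n\) and the resulting finite forward matrix might look like this; the unknown is given an \(n\)-term representation in the same cosine family (again an arbitrary choice), and the projected noise is approximated as white, which is roughly right here since the basis rows are near-orthonormal.

```python
# T_n: represent the unknown by n basis coefficients, U_n = T_n U.
n = 64
Phi_n = basis(x, n)        # basis for the unknown; need not match Phi_k

# Discretized forward map on coefficient vectors, u_n -> P_k A(Phi_n' u_n),
# assembled as a (k x n) matrix by pushing each basis function through A.
A_kn = Phi_k @ A(Phi_n).T

# A realization of the computational model M_kn = A_k U_n + E_k.
u_n_true = Phi_n @ u_true  # coefficients of (the projection of) U
m_kn = A_kn @ u_n_true + noise_level * rng.standard_normal(k)
```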

Note that a realization \(M_{k n}\left(\omega_{0}\right)\) belongs to the computational model, not to the practical measurement model, and so is not what we actually observe. This is why we take \(m_{k}=M_{k}\left(\omega_{0}\right)\) as the given data.

I said we would understand these inversion problems in Bayesian terms. To that end, we manufacture some prior density \(\Pi_{n}\) over the discretised unknown \(U_{n}\).

Denote the probability density function of the random variable \(M_{k n}\) by \(\Upsilon_{k n}\left(m_{k n}\right)\). The posterior density for \(U_{n}\) is given by the Bayes formula: \[ \pi_{k n}\left(u_{n} \mid m_{k n}\right)=\frac{\Pi_{n}\left(u_{n}\right) \exp \left(-\frac{1}{2}\left\|m_{k n}-A_{k} u_{n}\right\|_{2}^{2}\right)}{\Upsilon_{k n}\left(m_{k n}\right)}, \] where the exponential term is the likelihood arising from the computational model when the noise is white with identity covariance, and a priori information about \(U\) is expressed through the prior density \(\Pi_{n}\). We can now state the inverse problem more specifically: given a realization \(m_{k}=M_{k}\left(\omega_{0}\right)\), estimate \(U\) by the conditional mean (CM, or posterior mean) estimate \[ \mathbf{u}_{k n}:=\int_{\mathbb{R}^{n}} u_{n}\, \pi_{k n}\left(u_{n} \mid m_{k}\right)\, d u_{n}. \]
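Under these definitions, the posterior and its CM estimate can be approximated directly. Below is a minimal random-walk Metropolis sketch continuing the example above, assuming (for illustration) an isotropic Gaussian prior on the coefficients, and scaling the likelihood by the sketch's noise variance rather than the identity covariance of the display above. In this linear-Gaussian case the answer is also available in closed form (see §5), which makes this a useful sanity check rather than a serious algorithm.

```python
# Unnormalised log posterior pi_kn(u_n | m_k), with an assumed isotropic
# Gaussian prior Pi_n on the coefficients.
prior_var = 1.0

def log_post(u_n):
    resid = m_k - A_kn @ u_n
    return (-0.5 * resid @ resid / noise_level**2
            - 0.5 * u_n @ u_n / prior_var)

# CM estimate u_kn by averaging a random-walk Metropolis chain.
def cm_estimate(n_steps=20_000, step=0.02):
    u = np.zeros(n)
    lp = log_post(u)
    running_sum = np.zeros(n)
    for _ in range(n_steps):
        prop = u + step * rng.standard_normal(n)
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:  # Metropolis accept/reject
            u, lp = prop, lp_prop
        running_sum += u
    return running_sum / n_steps  # Monte Carlo estimate of the CM integral

u_cm = cm_estimate()             # tune `step` for a sane acceptance rate
u_recovered = Phi_n.T @ u_cm     # back to function values on the grid
```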

TBC

2 Very nearly exact methods

For specific problems there are specific, very nearly exact methods; see, e.g., F. Sigrist, Künsch, and Stahel (2015b) and Liu, Yeo, and Lu (2020) for advection/diffusion equations.

3 Approximation of the posterior

Generic models are trickier, and we usually have to approximate *something*. See Bao et al. (2020); Jo et al. (2020); Lu et al. (2021); Raissi, Perdikaris, and Karniadakis (2019); Tait and Damoulas (2020); Xu and Darve (2020); Yang, Zhang, and Karniadakis (2020); D. Zhang, Guo, and Karniadakis (2020); D. Zhang et al. (2019).

4 Bayesian nonparametrics

Since this kind of problem naturally invites functional parameters, we can also consider it in the context of Bayesian nonparametrics, which uses slightly different notation than you usually see in Bayes textbooks. I suspect there is a useful role for diverse Bayesian nonparametric methods here, especially non-smooth random measures, but the easiest of all is the Gaussian process, which I handle next.

5 Gaussian process parameters

Alexanderian (2021) states a ‘well-known’ result: the solution of a Bayesian linear inverse problem with Gaussian prior \(\mathcal{N}\left(m_{\text{pr}}, \mathcal{C}_{\text{pr}}\right)\), Gaussian noise with covariance \(\boldsymbol{\Gamma}_{\text{noise}}\), and linear forward operator \(\mathcal{F}\) is the Gaussian posterior \(\mu_{\text {post }}^{y}=\mathcal{N}\left(m_{\text {MAP }}, \mathcal{C}_{\text {post }}\right)\), where \[ \mathcal{C}_{\text {post }}=\left(\mathcal{F}^{*} \boldsymbol{\Gamma}_{\text {noise }}^{-1} \mathcal{F}+\mathcal{C}_{\text {pr }}^{-1}\right)^{-1} \quad \text { and } \quad m_{\text {MAP }}=\mathcal{C}_{\text {post }}\left(\mathcal{F}^{*} \boldsymbol{\Gamma}_{\text {noise }}^{-1} \boldsymbol{y}+\mathcal{C}_{\text {pr }}^{-1} m_{\text {pr }}\right). \]
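A finite-dimensional analogue is easy to check numerically. The following self-contained sketch uses a random matrix as a stand-in for the discretized forward operator \(\mathcal{F}\); all dimensions and scales are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
k, n = 32, 64
F = rng.standard_normal((k, n)) / np.sqrt(n)  # stand-in forward matrix
Gamma_noise = 0.01 * np.eye(k)                # noise covariance
C_pr, m_pr = np.eye(n), np.zeros(n)           # prior covariance and mean

# Synthetic data y = F u + noise.
u = rng.standard_normal(n)
y = F @ u + rng.multivariate_normal(np.zeros(k), Gamma_noise)

# The Gaussian posterior: C_post, and the MAP point, which is also the
# posterior mean in this linear-Gaussian setting.
Gi = np.linalg.inv(Gamma_noise)
C_post = np.linalg.inv(F.T @ Gi @ F + np.linalg.inv(C_pr))
m_map = C_post @ (F.T @ Gi @ y + np.linalg.inv(C_pr) @ m_pr)
```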

Note the connection to Gaussian belief propagation.

6 Finite Element Models and belief propagation


Finite Element Models of PDEs (and possibly other representations? orthogonal bases generally?) can be expressed through locally-linear relationships and thus analysed using Gaussian Belief Propagation (Y. El-Kurdi et al. 2016; Y. M. El-Kurdi 2014; Y. El-Kurdi et al. 2015). Note that in this setting there is nothing special about the inversion process: inference proceeds the same way whether the problem is forward or inverse, as a variational message-passing algorithm; see the sketch below.
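As a toy illustration of the mechanism (not of the cited papers' parallel solvers), here is a minimal Gaussian belief propagation loop that solves a symmetric, diagonally dominant linear system such as a 1-d FEM stiffness matrix; the matrix, size, and sweep count are all assumed for the example.

```python
import numpy as np

def gabp_solve(A, b, n_sweeps=100):
    # Gaussian belief propagation for A x = b, with A symmetric and (for
    # guaranteed convergence) diagonally dominant. Messages are kept in
    # information form: precisions P[i, j] and potentials h[i, j], i -> j.
    dim = len(b)
    P = np.zeros((dim, dim))
    h = np.zeros((dim, dim))
    for _ in range(n_sweeps):
        for i in range(dim):
            for j in np.flatnonzero(A[i]):
                if j == i:
                    continue
                # Cavity marginal at i, excluding the message from j.
                P_cav = A[i, i] + P[:, i].sum() - P[j, i]
                h_cav = b[i] + h[:, i].sum() - h[j, i]
                P[i, j] = -A[i, j] ** 2 / P_cav
                h[i, j] = -A[i, j] * h_cav / P_cav
    # Node marginals; the posterior means solve the linear system.
    return (b + h.sum(axis=0)) / (A.diagonal() + P.sum(axis=0))

# Example: a tridiagonal stiffness-like matrix from a 1-d mesh.
dim = 20
A = 2.1 * np.eye(dim) - np.eye(dim, k=1) - np.eye(dim, k=-1)
b = np.ones(dim)
x = gabp_solve(A, b)
assert np.allclose(A @ x, b, atol=1e-6)
```

Because the example graph is a chain (a tree), the messages converge exactly; on loopy FEM meshes GaBP still returns the exact means when it converges, which is part of its appeal as a solver.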

7 Score-based generative models

a.k.a. neural diffusions etc. Powerful, probably a worthy default starting point for new work

8 Incoming

9 References

Alexanderian. 2021. Optimal Experimental Design for Infinite-Dimensional Bayesian Inverse Problems Governed by PDEs: A Review.” arXiv:2005.12998 [Math].
Alzraiee, White, Knowling, et al. 2022. A Scalable Model-Independent Iterative Data Assimilation Tool for Sequential and Batch Estimation of High Dimensional Model Parameters and States.” Environmental Modelling & Software.
Anderson. 1982. Reverse-Time Diffusion Equation Models.” Stochastic Processes and Their Applications.
Bao, Ye, Zang, et al. 2020. Numerical Solution of Inverse Problems by Weak Adversarial Networks.” Inverse Problems.
Bastek, Sun, and Kochmann. 2024. Physics-Informed Diffusion Models.”
Battle, Cunningham, and Hanson. 1997. 3D Tomographic Reconstruction Using Geometrical Models.” In Medical Imaging 1997: Image Processing.
Bissantz, Hohage, and Munk. 2004. Consistency and Rates of Convergence of Nonlinear Tikhonov Regularization with Random Noise.” Inverse Problems.
Borcea, Druskin, and Knizhnerman. 2005. On the Continuum Limit of a Discrete Inverse Spectral Problem on Optimal Finite Difference Grids.” Communications on Pure and Applied Mathematics.
Brehmer, Louppe, Pavez, et al. 2020. Mining Gold from Implicit Models to Improve Likelihood-Free Inference.” Proceedings of the National Academy of Sciences.
Bui-Thanh, Ghattas, Martin, et al. 2013. A Computational Framework for Infinite-Dimensional Bayesian Inverse Problems Part I: The Linearized Case, with Application to Global Seismic Inversion.” SIAM Journal on Scientific Computing.
Bui-Thanh, and Nguyen. 2016. FEM-Based Discretization-Invariant MCMC Methods for PDE-Constrained Bayesian Inverse Problems.” Inverse Problems & Imaging.
Bunker, Girolami, Lambley, et al. 2024. Autoencoders in Function Space.”
Chada, Iglesias, Roininen, et al. 2018. Parameterizations for Ensemble Kalman Inversion.” Inverse Problems.
Chen, and Oliver. 2013. Levenberg–Marquardt Forms of the Iterative Ensemble Smoother for Efficient History Matching and Uncertainty Quantification.” Computational Geosciences.
Chung, Kim, Mccann, et al. 2023. Diffusion Posterior Sampling for General Noisy Inverse Problems.” In.
Cotter, Dashti, and Stuart. 2010. Approximation of Bayesian Inverse Problems for PDEs.” SIAM Journal on Numerical Analysis.
Cox. 1993. An Analysis of Bayesian Inference for Nonparametric Regression.” The Annals of Statistics.
Cranmer, Brehmer, and Louppe. 2020. The Frontier of Simulation-Based Inference.” Proceedings of the National Academy of Sciences.
Cui, Tiangang, and Dolgov. 2022. Deep Composition of Tensor-Trains Using Squared Inverse Rosenblatt Transports.” Foundations of Computational Mathematics.
Cui, T., Martin, Marzouk, et al. 2014. Likelihood-Informed Dimension Reduction for Nonlinear Inverse Problems.” Inverse Problems.
Cui, Tiangang, Marzouk, and Willcox. 2016. Scalable Posterior Approximations for Large-Scale Bayesian Inverse Problems via Likelihood-Informed Parameter and State Reduction.” Journal of Computational Physics.
D’Ambrogi, Mäenpää, and Markkanen. 1999. Discretization Independent Retrieval of Atmospheric Ozone Profile.” Geophysica.
Dashti, Harris, and Stuart. 2011. Besov Priors for Bayesian Inverse Problems.”
Dashti, and Stuart. 2015. The Bayesian Approach To Inverse Problems.” arXiv:1302.6989 [Math].
Dubrule. 2018. Kriging, Splines, Conditional Simulation, Bayesian Inversion and Ensemble Kalman Filtering.” In Handbook of Mathematical Geosciences: Fifty Years of IAMG.
Dunbar, Duncan, Stuart, et al. 2022. Ensemble Inference Methods for Models With Noisy and Expensive Likelihoods.” SIAM Journal on Applied Dynamical Systems.
Dupont, Kim, Eslami, et al. 2022. From Data to Functa: Your Data Point Is a Function and You Can Treat It Like One.” In Proceedings of the 39th International Conference on Machine Learning.
El-Kurdi, Yousef Malek. 2014. “Parallel Finite Element Processing Using Gaussian Belief Propagation Inference on Probabilistic Graphical Models.”
El-Kurdi, Yousef, Dehnavi, Gross, et al. 2015. Parallel Finite Element Technique Using Gaussian Belief Propagation.” Computer Physics Communications.
El-Kurdi, Yousef, Fernandez, Gross, et al. 2016. Acceleration of the Finite-Element Gaussian Belief Propagation Solver Using Minimum Residual Techniques.” IEEE Transactions on Magnetics.
Engl, Hofinger, and Kindermann. 2005. Convergence Rates in the Prokhorov Metric for Assessing Uncertainty in Ill-Posed Problems.” Inverse Problems.
Engl, and Nashed. 1981. Generalized Inverses of Random Linear Operators in Banach Spaces.” Journal of Mathematical Analysis and Applications.
Fitzpatrick. 1991. Bayesian Analysis in Inverse Problems.” Inverse Problems.
Florens, and Simoni. 2016. Regularizing Priors for Linear Inverse Problems.” Econometric Theory.
Franklin. 1970. Well-Posed Stochastic Extensions of Ill-Posed Linear Problems.” Journal of Mathematical Analysis and Applications.
Gahungu, Lanyon, Álvarez, et al. 2022. Adjoint-Aided Inference of Gaussian Process Driven Differential Equations.” In.
Ghattas, and Willcox. 2021. Learning Physics-Based Models from Data: Perspectives from Inverse Problems and Model Reduction.” Acta Numerica.
Grigorievskiy, Lawrence, and Särkkä. 2017. Parallelizable Sparse Inverse Formulation Gaussian Processes (SpInGP).” In arXiv:1610.08035 [Stat].
Gupta, ed. 2021. Encyclopedia of Solid Earth Geophysics. Encyclopedia of Earth Sciences Series.
Guth, Schillings, and Weissmann. 2020. Ensemble Kalman Filter for Neural Network Based One-Shot Inversion.”
Holderrieth, Hutchinson, and Teh. 2021. Equivariant Learning of Stochastic Fields: Gaussian Processes and Steerable Conditional Neural Processes.” In Proceedings of the 38th International Conference on Machine Learning.
Holl, Koltun, and Thuerey. 2022. Scale-Invariant Learning by Physics Inversion.” In.
Huang, Schneider, and Stuart. 2022. Iterated Kalman Methodology for Inverse Problems.” Journal of Computational Physics.
Iglesias, Marco A. 2016. A Regularizing Iterative Ensemble Kalman Method for PDE-Constrained Inverse Problems.” Inverse Problems.
Iglesias, M. A., Law, and Stuart. 2012. MCMC for the Evaluation of Gaussian Approximations to Bayesian Inverse Problems in Groundwater Flow.” AIP Conference Proceedings.
Iglesias, Marco A., Law, and Stuart. 2013. Ensemble Kalman Methods for Inverse Problems.” Inverse Problems.
Iglesias, Marco A., Lin, Lu, et al. 2015. Filter Based Methods For Statistical Linear Inverse Problems.”
Jalal, Arvinte, Daras, et al. 2021. Robust Compressed Sensing MRI with Deep Generative Priors.” In Advances in Neural Information Processing Systems.
Jo, Son, Hwang, et al. 2020. Deep Neural Network Approach to Forward-Inverse Problems.” Networks & Heterogeneous Media.
Kaipio, and Somersalo. 2005. Statistical and Computational Inverse Problems. Applied Mathematical Sciences.
Kaipio, and Somersalo. 2007. Statistical Inverse Problems: Discretization, Model Reduction and Inverse Crimes.” Journal of Computational and Applied Mathematics.
Kennedy, and O’Hagan. 2001. Bayesian Calibration of Computer Models.” Journal of the Royal Statistical Society: Series B (Statistical Methodology).
Knapik, van der Vaart, and van Zanten. 2011. Bayesian Inverse Problems with Gaussian Priors.” The Annals of Statistics.
Krämer, Bosch, Schmidt, et al. 2021. Probabilistic ODE Solutions in Millions of Dimensions.”
Lasanen, Sari. 2002. “Discretizations of Generalized Random Variables with Applications to Inverse Problems.”
———. 2012a. Non-Gaussian Statistical Inverse Problems. Part I: Posterior Distributions.” Inverse Problems and Imaging.
———. 2012b. Non-Gaussian Statistical Inverse Problems. Part II: Posterior Convergence for Approximated Unknowns.” Inverse Problems & Imaging.
Lasanen, S, and Roininen. 2005. “Statistical Inversion with Green’s Priors.” In Proceedings of the 5th International Conference on Inverse Problems in Engineering: Theory and Practice, Cambridge, UK.
Lassas, Saksman, and Siltanen. 2009. Discretization-Invariant Bayesian Inversion and Besov Space Priors.” Inverse Problems and Imaging.
Lassas, and Siltanen. 2004. Can One Use Total Variation Prior for Edge-Preserving Bayesian Inversion?” Inverse Problems.
Lehtinen, Paivarinta, and Somersalo. 1989. Linear Inverse Problems for Generalised Random Variables.” Inverse Problems.
Liu, Yeo, and Lu. 2020. Statistical Modeling for Spatio-Temporal Data From Stochastic Convection-Diffusion Processes.” Journal of the American Statistical Association.
Lu, Jin, and Karniadakis. 2020. DeepONet: Learning Nonlinear Operators for Identifying Differential Equations Based on the Universal Approximation Theorem of Operators.” arXiv:1910.03193 [Cs, Stat].
Lu, Meng, Mao, et al. 2021. DeepXDE: A Deep Learning Library for Solving Differential Equations.” SIAM Review.
Luschgy. 1996. Linear Estimators and Radonifying Operators.” Theory of Probability & Its Applications.
Magnani, Krämer, Eschenhagen, et al. 2022. Approximate Bayesian Neural Operators: Uncertainty Quantification for Parametric PDEs.”
Mandelbaum. 1984. Linear Estimators and Measurable Linear Transformations on a Hilbert Space.” Zeitschrift Für Wahrscheinlichkeitstheorie Und Verwandte Gebiete.
Margossian, Vehtari, Simpson, et al. 2020. Hamiltonian Monte Carlo Using an Adjoint-Differentiated Laplace Approximation: Bayesian Inference for Latent Gaussian Models and Beyond.” arXiv:2004.12550 [Stat].
Matthies, Zander, Rosić, et al. 2016. Parameter Estimation via Conditional Expectation: A Bayesian Inversion.” Advanced Modeling and Simulation in Engineering Sciences.
McCoy. 1972. Higher-Order Moments of the Inverse of a Linear Stochastic Operator.” JOSA.
Mosegaard. 2011. Quest for Consistency, Symmetry, and Simplicity — The Legacy of Albert Tarantola.” GEOPHYSICS.
Mosegaard, and Tarantola. 1995. Monte Carlo Sampling of Solutions to Inverse Problems.” Journal of Geophysical Research: Solid Earth.
———. 2002. Probabilistic Approach to Inverse Problems.” In International Geophysics. International Handbook of Earthquake and Engineering Seismology, Part A.
Niinimäki, Siltanen, and Kolehmainen. 2007. Bayesian multiresolution method for local tomography in dental x-ray imaging.” Physics in Medicine and Biology.
O’Hagan. 2006. Bayesian Analysis of Computer Code Outputs: A Tutorial.” Reliability Engineering & System Safety, The Fourth International Conference on Sensitivity Analysis of Model Output (SAMO 2004).
Oliver. 2022. Hybrid Iterative Ensemble Smoother for History Matching of Hierarchical Models.” Mathematical Geosciences.
Perdikaris, and Karniadakis. 2016. Model inversion via multi-fidelity Bayesian optimization: a new paradigm for parameter estimation in haemodynamics, and beyond.” Journal of the Royal Society, Interface.
Petra, Martin, Stadler, et al. 2014. A Computational Framework for Infinite-Dimensional Bayesian Inverse Problems, Part II: Stochastic Newton MCMC with Application to Ice Sheet Flow Inverse Problems.” SIAM Journal on Scientific Computing.
Phillips, Seror, Hutchinson, et al. 2022. Spectral Diffusion Processes.” In.
Pielok, Bischl, and Rügamer. 2023. Approximate Bayesian Inference with Stein Functional Variational Gradient Descent.” In.
Piiroinen. 2005. “Statistical Measurements, Experiments and Applications.”
Pikkarainen. 2006. State Estimation Approach to Nonstationary Inverse Problems: Discretization Error and Filtering Problem.” Inverse Problems.
Pinski, Simpson, Stuart, et al. 2015. Kullback-Leibler Approximation for Probability Measures on Infinite Dimensional Spaces.” SIAM Journal on Mathematical Analysis.
Plumlee. 2017. Bayesian Calibration of Inexact Computer Models.” Journal of the American Statistical Association.
Preston, and Poppeliers. 2021. LDRD #218329: Uncertainty Quantification of Geophysical Inversion Using Stochastic Partial Differential Equations. SAND2021-10885.
Raissi, Perdikaris, and Karniadakis. 2017a. Physics Informed Deep Learning (Part I): Data-Driven Solutions of Nonlinear Partial Differential Equations.”
———. 2017b. Physics Informed Deep Learning (Part II): Data-Driven Discovery of Nonlinear Partial Differential Equations.”
Raissi, Perdikaris, and Karniadakis. 2019. Physics-Informed Neural Networks: A Deep Learning Framework for Solving Forward and Inverse Problems Involving Nonlinear Partial Differential Equations.” Journal of Computational Physics.
Roininen, Huttunen, and Lasanen. 2014. Whittle-Matérn Priors for Bayesian Statistical Inversion with Applications in Electrical Impedance Tomography.” Inverse Problems & Imaging.
Roosta-Khorasani, Doel, and Ascher. 2014. Data Completion and Stochastic Algorithms for PDE Inversion Problems with Many Measurements.”
Rudner, Chen, Teh, et al. 2022. Tractable Function-Space Variational Inference in Bayesian Neural Networks.” In.
Sainsbury-Dale, Zammit-Mangion, and Huser. 2022. Fast Optimal Estimation with Intractable Models Using Permutation-Invariant Neural Networks.”
Sambridge, Jackson, and Valentine. 2022. Geophysical Inversion and Optimal Transport.” Geophysical Journal International.
Sambridge, and Mosegaard. 2002. Monte Carlo Methods in Geophysical Inverse Problems.” Reviews of Geophysics.
Särkkä, Solin, and Hartikainen. 2013. Spatiotemporal Learning via Infinite-Dimensional Bayesian Filtering and Smoothing: A Look at Gaussian Process Regression Through Kalman Filtering.” IEEE Signal Processing Magazine.
Schillings, and Stuart. 2017. Analysis of the Ensemble Kalman Filter for Inverse Problems.” SIAM Journal on Numerical Analysis.
Schneider, Stuart, and Wu. 2022. Ensemble Kalman Inversion for Sparse Learning of Dynamical Systems from Time-Averaged Data.” Journal of Computational Physics.
Sharrock, Simons, Liu, et al. 2022. Sequential Neural Score Estimation: Likelihood-Free Inference with Conditional Score Based Diffusion Models.”
Sigrist, Fabio Roman Albert. 2013. Physics Based Dynamic Modeling of Space-Time Data.”
Sigrist, Fabio, Künsch, and Stahel. 2015a. Spate: An R Package for Spatio-Temporal Modeling with a Stochastic Advection-Diffusion Process.” Journal of Statistical Software.
———. 2015b. Stochastic Partial Differential Equation Based Modelling of Large Space-Time Data Sets.” Journal of the Royal Statistical Society: Series B (Statistical Methodology).
Song, Shen, Xing, et al. 2022. Solving Inverse Problems in Medical Imaging with Score-Based Generative Models.” In.
Song, Sohl-Dickstein, Kingma, et al. 2022. Score-Based Generative Modeling Through Stochastic Differential Equations.” In.
Spantini. 2017. On the low-dimensional structure of Bayesian inference.”
Stuart, Andrew M. 2010. Inverse Problems: A Bayesian Perspective.” Acta Numerica.
Stuart, Andrew M., and Teckentrup. 2016. Posterior Consistency for Gaussian Process Approximations of Bayesian Posterior Distributions.” arXiv:1603.02004 [Math].
Sun, Zhang, Shi, et al. 2019. Functional Variational Bayesian Neural Networks.” In.
Tait, and Damoulas. 2020. Variational Autoencoding of PDE Inverse Problems.” arXiv:2006.15641 [Cs, Stat].
Tarantola. 2005. Inverse Problem Theory and Methods for Model Parameter Estimation.
———. 2007. Mapping Of Probabilities.
Teckentrup. 2020. Convergence of Gaussian Process Regression with Estimated Hyper-Parameters and Applications in Bayesian Inverse Problems.” arXiv:1909.00232 [Cs, Math, Stat].
Valentine, Andrew P, and Sambridge. 2020a. Gaussian Process Models—I. A Framework for Probabilistic Continuous Inverse Theory.” Geophysical Journal International.
———. 2020b. Gaussian Process Models—II. Lessons for Discrete Inversion.” Geophysical Journal International.
Valentine, Andrew, and Sambridge. 2022. Emerging Directions in Geophysical Inversion.”
van Schuppen. 1989. Stochastic Realization Problems.” In Three Decades of Mathematical System Theory: A Collection of Surveys at the Occasion of the 50th Birthday of Jan C. Willems. Lecture Notes in Control and Information Sciences.
Vogel. 1984. Stochastic Inversion of Linear First Kind Integral Equations. II. Discrete Theory and Convergence Results.” Journal of Integral Equations.
Wang, Ren, Zhu, et al. 2018. Function Space Particle Optimization for Bayesian Neural Networks.” In.
Welter, Doherty, Hunt, et al. 2012. Approaches in Highly Parameterized Inversion: PEST++, a Parameter Estimation Code Optimized for Large Environmental Models.”
Welter, White, Hunt, et al. 2015. Approaches in Highly Parameterized Inversion—PEST++ Version 3, a Parameter ESTimation and Uncertainty Analysis Software Suite Optimized for Large Environmental Models.” USGS Numbered Series 7-C12. Techniques and Methods.
White. 2018. A Model-Independent Iterative Ensemble Smoother for Efficient History-Matching and Uncertainty Quantification in Very High Dimensions.” Environmental Modelling & Software.
White, Fienen, Barlow, et al. 2018. A Tool for Efficient, Model-Independent Management Optimization Under Uncertainty.” Environmental Modelling & Software.
White, Fienen, and Doherty. 2016a. pyEMU: A Python Framework for Environmental Model Uncertainty Analysis Version .01.”
———. 2016b. A Python Framework for Environmental Model Uncertainty Analysis.” Environmental Modelling & Software.
White, Hunt, Fienen, et al. 2020. Approaches to Highly Parameterized Inversion: PEST++ Version 5, a Software Suite for Parameter Estimation, Uncertainty Analysis, Management Optimization and Sensitivity Analysis.” USGS Numbered Series 7-C26. Techniques and Methods.
Wu, Maruyama, and Leskovec. 2022. Learning to Accelerate Partial Differential Equations via Latent Global Evolution.”
Xu, and Darve. 2019. Adversarial Numerical Analysis for Inverse Problems.”
———. 2020. ADCME: Learning Spatially-Varying Physical Fields Using Deep Neural Networks.” In arXiv:2011.11955 [Cs, Math].
Yang, Meng, and Karniadakis. 2021. B-PINNs: Bayesian Physics-Informed Neural Networks for Forward and Inverse PDE Problems with Noisy Data.” Journal of Computational Physics.
Yang, Zhang, and Karniadakis. 2020. Physics-Informed Generative Adversarial Networks for Stochastic Differential Equations.” SIAM Journal on Scientific Computing.
Zahm, Cui, Law, et al. 2022. Certified Dimension Reduction in Nonlinear Bayesian Inverse Problems.” Mathematics of Computation.
Zammit-Mangion, Bertolacci, Fisher, et al. 2021. WOMBAT v1.0: A fully Bayesian global flux-inversion framework.” Geoscientific Model Development Discussions.
Zhang, Xin, and Curtis. 2021. Bayesian Geophysical Inversion Using Invertible Neural Networks.” Journal of Geophysical Research: Solid Earth.
Zhang, Dongkun, Guo, and Karniadakis. 2020. Learning in Modal Space: Solving Time-Dependent Stochastic PDEs Using Physics-Informed Neural Networks.” SIAM Journal on Scientific Computing.
Zhang, Zhongqiang, and Karniadakis. 2017. Numerical Methods for Stochastic Partial Differential Equations with White Noise. Applied Mathematical Sciences.
Zhang, Dongkun, Lu, Guo, et al. 2019. Quantifying Total Uncertainty in Physics-Informed Neural Networks for Solving Forward and Inverse Stochastic Problems.” Journal of Computational Physics.