Bayesian inverse problems in function space

a.k.a. Bayesian calibration, model uncertainty for PDEs and other wibbly, blobby things



Inverse problems where the unknown parameter lives in some function space. For me this usually implies a spatiotemporal model, often in the context of PDE solvers, particularly approximate ones.

Suppose I have a PDE, possibly with some unknown parameters in the driving equation. I can do an adequate job of predicting the future behaviour of that system if I somehow know the governing equations, their parameters, and the current state. But what if I am missing some information? What if I wish to simultaneously infer some unknown inputs? Let us say, the starting state? This is the kind of problem that we refer to as an inverse problem. Inverse problems arise naturally in tomography, compressed sensing, deconvolution, inverting PDEs and many other areas.

The thing that is special about PDEs is that they have spatial structure, much more structure than the low-dimensional inference problems that statisticians traditionally studied, and so it is worth reasoning through them from first principles.

In particular, I would like to work through enough notation here to understand the various methods used to solve these inverse problems, for example simulation-based inference, MCMC methods, GANs, or variational inference.

Generally, I am interested in problems that use some kind of probabilistic network so that we do not just guess the solution but also quantify our uncertainty about it.

Discretisation

The first step is imagining how we can handle this complex problem on a finite computer. Lassas, Saksman, and Siltanen (2009) introduce a nice notation for this, which I use here. It connects the problem of inference to the problem of sampling theory, via the realisation that we need to discretise the solution in order to compute it.

I also wish to ransack their literature review:

The study of Bayesian inversion in infinite-dimensional function spaces was initiated by Franklin (1970) and continued by Mandelbaum (1984); Lehtinen, Paivarinta, and Somersalo (1989); Fitzpatrick (1991), and Luschgy (1996). The concept of discretization invariance was formulated by Markku Lehtinen in the 1990s and has been studied by D’Ambrogi, MΓ€enpÀÀ, and Markkanen (1999); Sari Lasanen (2002); S. Lasanen and Roininen (2005); Piiroinen (2005). A definition of discretization invariance similar to the above was given in Lassas and Siltanen (2004). For other kinds of discretization of continuum objects in the Bayesian framework, see Battle, Cunningham, and Hanson (1997); NiinimΓ€ki, Siltanen, and Kolehmainen (2007)… For regularization based approaches for statistical inverse problems, see Bissantz, Hohage, and Munk (2004); Engl, Hofinger, and Kindermann (2005); Engl and Nashed (1981); Pikkarainen (2006). The relationship between continuous and discrete (non-statistical) inversion is studied in Hilbert spaces in Vogel (1984). See Borcea, Druskin, and Knizhnerman (2005) for specialized discretizations for inverse problems.

The insight is that two discretisations are relevant: the discretisation of the measurements and the discretisation of the representation of the solution. Naturally, we need one discretisation, \(P_{k}\), to handle the finiteness of our measurements, and another, \(T_{n}\), to characterise the finite dimensionality of our solution.

Consider a quantity \(U\) observed via some indirect, noisy mapping \[ M=A U+\mathcal{E}, \] where \(A\) is an operator and \(\mathcal{E}\) is some mean-zero noise. We call this the continuum model. Here \(U\) and \(M\) are functions defined on subsets of \(\mathbb{R}^{d}\). We start by assuming \(A\) is a linear smoothing operator, e.g. convolution with some kernel. We intend to use Bayesian inversion to deduce information about \(U\) from measurement data concerning \(M\). We write these using random function notation: \(U(x, \omega), M(y, \omega)\) and \(\mathcal{E}(y, \omega)\) are random functions with \(\omega \in \Omega\) drawn from some probability space \((\Omega, \Sigma, \mathbb{P})\). \(x\) and \(y\) denote the function arguments, i.e. they range over the Euclidean domains. These objects are all continuous; we explore the implications of discretising them.
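
To make this concrete, here is a minimal numerical sketch of the continuum model, approximated on a fine grid over \([0,1]\). The Gaussian-kernel convolution standing in for \(A\), and all variable names, are my own illustrative assumptions, not anything from the papers cited:

```python
import numpy as np

# Sketch of the continuum model M = A U + E, approximated on a fine grid.
rng = np.random.default_rng(0)
n_fine = 512
x = np.linspace(0.0, 1.0, n_fine)
dx = x[1] - x[0]

def make_convolution_operator(x, width=0.05):
    """Dense matrix approximating convolution with a Gaussian kernel."""
    K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / width) ** 2)
    return K * (x[1] - x[0]) / (width * np.sqrt(2.0 * np.pi))

A = make_convolution_operator(x)
u_true = np.where((x > 0.3) & (x < 0.6), 1.0, 0.0)   # a blocky "true" U
m = A @ u_true + 0.01 * rng.standard_normal(n_fine)  # realization m = M(omega_0)
```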

Next we introduce the practical measurement model, which is the first kind of discretisation. We assume that the measurement device provides us with a \(k\)-dimensional realization \[ M_{k}=P_{k} M=A_{k} U+\mathcal{E}_{k}, \] where \(A_{k}=P_{k} A\) and \(\mathcal{E}_{k}=P_{k} \mathcal{E}\). \(P_{k}\) is a linear operator describing the measurement process. For simplicity we take \(P_{k}\) to be a projection onto a \(k\)-element orthogonal basis \(\{\phi_{j}\}_j\), i.e. \(P_{k} v=\sum_{j=1}^{k}\left\langle v, \phi_{j}\right\rangle \phi_{j}\). Realized measurements are written \(m=M\left(\omega_{0}\right)\) for some \(\omega_{0} \in \Omega\); projected measurement vectors are similarly written \(m_{k}=M_{k}\left(\omega_{0}\right)\).
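
Continuing the sketch, one hypothetical choice for \(P_{k}\) is projection onto the first \(k\) functions of a cosine basis (orthonormal on \([0,1]\)), with inner products approximated by quadrature:

```python
# Sketch of the measurement projection P_k onto an assumed cosine basis.
k = 32
j = np.arange(1, k + 1)
Phi = np.sqrt(2.0) * np.cos(np.pi * j[None, :] * x[:, None])  # (n_fine, k)

def project(v):
    """Coefficients <v, phi_j> by quadrature, and the projection P_k v."""
    coeffs = Phi.T @ v * dx
    return coeffs, Phi @ coeffs

m_k, P_k_m = project(m)  # m_k is the k-dimensional realized measurement
```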

In this notation, the inverse problem is: given a realization \(M_{k}\left(\omega_{0}\right)\), estimate the distribution of \(U\).

We cannot represent that distribution yet because \(U\) is a continuum object. So we introduce another discretisation, via another projection operator \(T_n\) which maps \(U\) to an \(n\)-dimensional space, \(U_n:=T_n U\). This gives us the computational model, \[ M_{k n}=A_{k} U_{n}+\mathcal{E}_{k}. \]
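
In the same sketch, taking \(T_n\) to be projection onto \(n\) piecewise-constant elements (again my own assumption; any finite representation would do) collapses the whole forward map to a \(k \times n\) matrix acting on coefficient vectors:

```python
# Sketch of the computational model M_kn = A_k U_n + E_k.
n = 64
edges = np.linspace(0.0, 1.0, n + 1)
# Indicator basis of T_n evaluated on the fine grid, shape (n_fine, n)
T = ((x[:, None] >= edges[None, :-1]) & (x[:, None] < edges[None, 1:])).astype(float)
T[-1, -1] = 1.0  # put the right endpoint in the last cell

A_kn = (Phi.T * dx) @ A @ T  # maps n coefficients of U_n to k measurement coefficients
```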

I said we would understand this in Bayesian terms. To that end, we manufacture some prior density \(\Pi_{n}\) over the discretised solutions \(U_{n}\).
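
For illustration only, here is one cheap stand-in for \(\Pi_{n}\): a zero-mean Gaussian smoothness prior over the coefficients of \(U_n\). Note that Lassas, Saksman, and Siltanen (2009) are specifically interested in non-Gaussian (Besov) priors; the choice below is just an assumption to keep the sketch runnable:

```python
# An assumed Gaussian prior Pi_n over the n cell coefficients of U_n.
mid = 0.5 * (edges[:-1] + edges[1:])
ell, sigma_pr = 0.1, 1.0
C_pr = sigma_pr**2 * np.exp(-0.5 * ((mid[:, None] - mid[None, :]) / ell) ** 2)
C_pr += 1e-8 * np.eye(n)  # jitter so the covariance is invertible
u_prior_draws = rng.multivariate_normal(np.zeros(n), C_pr, size=5)
```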

TBC

Very nearly exact methods

For specific problems there are specific methods, for example F. Sigrist, KΓΌnsch, and Stahel (2015b) and Liu, Yeo, and Lu (2020) for advection-diffusion equations.

Approximation of the posterior

Generic models are trickier, and we usually have to approximate _something_. See Bao et al. (2020); Jo et al. (2020); Lu et al. (2021); Raissi, Perdikaris, and Karniadakis (2019); Tait and Damoulas (2020); Xu and Darve (2020); Yang, Zhang, and Karniadakis (2020); D. Zhang, Guo, and Karniadakis (2020); D. Zhang et al. (2019).

Bayesian nonparametrics

Since this kind of problem naturally invites functional parameters, we can also consider it in the context of Bayesian nonparametrics, which uses slightly different notation than you usually see in Bayes textbooks. I suspect that there is a useful role for various Bayesian nonparametric methods here, but the easiest of all is the Gaussian process, which I handle next.

Gaussian process parameters

Alexanderian (2021) states a β€˜well-known’ result: the solution of a Bayesian linear inverse problem with Gaussian prior \(\mathcal{N}\left(m_{\mathrm{pr}}, \mathcal{C}_{\mathrm{pr}}\right)\), linear forward operator \(\mathcal{F}\), and Gaussian noise with covariance \(\boldsymbol{\Gamma}_{\text{noise}}\) is the Gaussian posterior \(\mu_{\text {post }}^{y}=\mathcal{N}\left(m_{\text {MAP }}, \mathcal{C}_{\text {post }}\right)\), where \[ \mathcal{C}_{\text {post }}=\left(\mathcal{F}^{*} \boldsymbol{\Gamma}_{\text {noise }}^{-1} \mathcal{F}+\mathcal{C}_{\text {pr }}^{-1}\right)^{-1} \quad \text { and } \quad m_{\text {MAP }}=\mathcal{C}_{\text {post }}\left(\mathcal{F}^{*} \boldsymbol{\Gamma}_{\text {noise }}^{-1} \boldsymbol{y}+\mathcal{C}_{\text {pr }}^{-1} m_{\mathrm{pr}}\right). \]
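
In the finite-dimensional deconvolution sketch above, these formulas are directly computable, with \(\mathcal{F}\) played by the hypothetical `A_kn`, \(\boldsymbol{\Gamma}_{\text{noise}}=\sigma^{2} I\) an assumed noise scale, and a zero prior mean:

```python
# Finite-dimensional Gaussian posterior for the deconvolution sketch.
sigma_noise = 0.01
Gamma_inv = np.eye(k) / sigma_noise**2
C_pr_inv = np.linalg.inv(C_pr)
m_pr = np.zeros(n)

C_post = np.linalg.inv(A_kn.T @ Gamma_inv @ A_kn + C_pr_inv)
m_map = C_post @ (A_kn.T @ Gamma_inv @ m_k + C_pr_inv @ m_pr)
post_sd = np.sqrt(np.diag(C_post))  # pointwise posterior standard deviation
```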

Note the connection to Gaussian belief propagation.

Finite Element Models and belief propagation

Finite element models of PDEs (and possibly other representations? Orthogonal bases generally?) can be expressed through locally-linear relationships and thus analysed using Gaussian belief propagation (Y. El-Kurdi et al. 2016; Y. M. El-Kurdi 2014; Y. El-Kurdi et al. 2015). Note that in this setting, there is nothing special about the inversion process: inference proceeds the same way, forward or inverse, as a variational message-passing algorithm.
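
As a toy illustration of the idea (not the El-Kurdi et al. implementation), here is scalar Gaussian belief propagation computing the marginals of \(\mathcal{N}(J^{-1}h, J^{-1})\) for a sparse precision matrix \(J\), which is how a stiffness-matrix system would be attacked; convergence is guaranteed for, e.g., diagonally dominant \(J\), which I assume here:

```python
import numpy as np

def gabp(J, h, iters=50):
    """Scalar Gaussian belief propagation for the moments of N(J^-1 h, J^-1)."""
    n = len(h)
    Jmsg = np.zeros((n, n))  # precision message from node i to node j
    hmsg = np.zeros((n, n))  # potential message from node i to node j
    nbrs = [np.flatnonzero((J[i] != 0) & (np.arange(n) != i)) for i in range(n)]
    for _ in range(iters):
        for i in range(n):
            for j in nbrs[i]:
                # Aggregate incoming messages to i, excluding the one from j
                Jtot = J[i, i] + Jmsg[nbrs[i], i].sum() - Jmsg[j, i]
                htot = h[i] + hmsg[nbrs[i], i].sum() - hmsg[j, i]
                Jmsg[i, j] = -J[i, j] ** 2 / Jtot
                hmsg[i, j] = -J[i, j] * htot / Jtot
    Jhat = J.diagonal() + Jmsg.sum(axis=0)      # marginal precisions
    mu = (h + hmsg.sum(axis=0)) / Jhat          # marginal means
    return mu, 1.0 / Jhat

# A tiny tridiagonal (chain-graph) system, where GaBP is exact at convergence
J = np.array([[4.0, 1.0, 0.0], [1.0, 4.0, 1.0], [0.0, 1.0, 4.0]])
h = np.array([1.0, 2.0, 3.0])
mu, var = gabp(J, h)  # mu approximates np.linalg.solve(J, h)
```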

References

Alexanderian, Alen. 2021. β€œOptimal Experimental Design for Infinite-Dimensional Bayesian Inverse Problems Governed by PDEs: A Review.” arXiv:2005.12998 [Math], January.
Anderson, Brian D. O. 1982. β€œReverse-Time Diffusion Equation Models.” Stochastic Processes and Their Applications 12 (3): 313–26.
Bao, Gang, Xiaojing Ye, Yaohua Zang, and Haomin Zhou. 2020. β€œNumerical Solution of Inverse Problems by Weak Adversarial Networks.” Inverse Problems 36 (11): 115003.
Battle, Xavier L., Gregory S. Cunningham, and Kenneth M. Hanson. 1997. β€œ3D Tomographic Reconstruction Using Geometrical Models.” In Medical Imaging 1997: Image Processing, 3034:346–57. SPIE.
Bissantz, Nicolai, Thorsten Hohage, and Axel Munk. 2004. β€œConsistency and Rates of Convergence of Nonlinear Tikhonov Regularization with Random Noise.” Inverse Problems 20 (6): 1773–89.
Borcea, Liliana, Vladimir Druskin, and Leonid Knizhnerman. 2005. β€œOn the Continuum Limit of a Discrete Inverse Spectral Problem on Optimal Finite Difference Grids.” Communications on Pure and Applied Mathematics 58 (9): 1231–79.
Brehmer, Johann, Gilles Louppe, Juan Pavez, and Kyle Cranmer. 2020. β€œMining Gold from Implicit Models to Improve Likelihood-Free Inference.” Proceedings of the National Academy of Sciences 117 (10): 5242–49.
Bui-Thanh, Tan, Omar Ghattas, James Martin, and Georg Stadler. 2013. β€œA Computational Framework for Infinite-Dimensional Bayesian Inverse Problems Part I: The Linearized Case, with Application to Global Seismic Inversion.” SIAM Journal on Scientific Computing 35 (6): A2494–2523.
Bui-Thanh, Tan, and Quoc P. Nguyen. 2016. β€œFEM-Based Discretization-Invariant MCMC Methods for PDE-Constrained Bayesian Inverse Problems.” Inverse Problems & Imaging 10 (4): 943.
Cotter, S. L., M. Dashti, and A. M. Stuart. 2010. β€œApproximation of Bayesian Inverse Problems for PDEs.” SIAM Journal on Numerical Analysis 48 (1): 322–45.
Cox, Dennis D. 1993. β€œAn Analysis of Bayesian Inference for Nonparametric Regression.” The Annals of Statistics 21 (2): 903–23.
Cranmer, Kyle, Johann Brehmer, and Gilles Louppe. 2020. β€œThe Frontier of Simulation-Based Inference.” Proceedings of the National Academy of Sciences, May.
D’Ambrogi, Barbara, Sari MΓ€enpÀÀ, and Markku Markkanen. 1999. β€œDiscretization Independent Retrieval of Atmospheric Ozone Profile.” Geophysica 35 (1-2): 87–99.
Dashti, Masoumeh, Stephen Harris, and Andrew Stuart. 2011. β€œBesov Priors for Bayesian Inverse Problems.” arXiv.
Dashti, Masoumeh, and Andrew M. Stuart. 2015. β€œThe Bayesian Approach To Inverse Problems.” arXiv:1302.6989 [Math], July.
Dubrule, Olivier. 2018. β€œKriging, Splines, Conditional Simulation, Bayesian Inversion and Ensemble Kalman Filtering.” In Handbook of Mathematical Geosciences: Fifty Years of IAMG, edited by B.S. Daya Sagar, Qiuming Cheng, and Frits Agterberg, 3–24. Cham: Springer International Publishing.
Dupont, Emilien, Hyunjik Kim, S. M. Ali Eslami, Danilo Rezende, and Dan Rosenbaum. 2022. β€œFrom Data to Functa: Your Data Point Is a Function and You Can Treat It Like One.” arXiv.
El-Kurdi, Yousef Malek. 2014. β€œParallel Finite Element Processing Using Gaussian Belief Propagation Inference on Probabilistic Graphical Models.” PhD Thesis, McGill University.
El-Kurdi, Yousef, Maryam Mehri Dehnavi, Warren J. Gross, and Dennis Giannacopoulos. 2015. β€œParallel Finite Element Technique Using Gaussian Belief Propagation.” Computer Physics Communications 193 (August): 38–48.
El-Kurdi, Yousef, David Fernandez, Warren J. Gross, and Dennis D. Giannacopoulos. 2016. β€œAcceleration of the Finite-Element Gaussian Belief Propagation Solver Using Minimum Residual Techniques.” IEEE Transactions on Magnetics 52 (3): 1–4.
Engl, Heinz W., Andreas Hofinger, and Stefan Kindermann. 2005. β€œConvergence Rates in the Prokhorov Metric for Assessing Uncertainty in Ill-Posed Problems.” Inverse Problems 21 (1): 399–412.
Engl, Heinz W., and M. Zuhair Nashed. 1981. β€œGeneralized Inverses of Random Linear Operators in Banach Spaces.” Journal of Mathematical Analysis and Applications 83 (2): 582–610.
Fitzpatrick, B. G. 1991. β€œBayesian Analysis in Inverse Problems.” Inverse Problems 7 (5): 675–702.
Florens, Jean-Pierre, and Anna Simoni. 2016. β€œRegularizing Priors for Linear Inverse Problems.” Econometric Theory 32 (1): 71–121.
Franklin, Joel N. 1970. β€œWell-Posed Stochastic Extensions of Ill-Posed Linear Problems.” Journal of Mathematical Analysis and Applications 31 (3): 682–716.
Ghattas, Omar, and Karen Willcox. 2021. β€œLearning Physics-Based Models from Data: Perspectives from Inverse Problems and Model Reduction.” Acta Numerica 30 (May): 445–554.
Grigorievskiy, Alexander, Neil Lawrence, and Simo SΓ€rkkΓ€. 2017. β€œParallelizable Sparse Inverse Formulation Gaussian Processes (SpInGP).” In arXiv:1610.08035 [Stat].
Guth, Philipp A., Claudia Schillings, and Simon Weissmann. 2020. β€œEnsemble Kalman Filter for Neural Network Based One-Shot Inversion.” arXiv.
Jalal, Ajil, Marius Arvinte, Giannis Daras, Eric Price, Alexandros G Dimakis, and Jon Tamir. 2021. β€œRobust Compressed Sensing MRI with Deep Generative Priors.” In Advances in Neural Information Processing Systems, 34:14938–54. Curran Associates, Inc.
Jo, Hyeontae, Hwijae Son, Hyung Ju Hwang, and Eun Heui Kim. 2020. β€œDeep Neural Network Approach to Forward-Inverse Problems.” Networks & Heterogeneous Media 15 (2): 247.
Kaipio, Jari, and E. Somersalo. 2005. Statistical and Computational Inverse Problems. Applied Mathematical Sciences. New York: Springer-Verlag.
Kaipio, Jari, and Erkki Somersalo. 2007. β€œStatistical Inverse Problems: Discretization, Model Reduction and Inverse Crimes.” Journal of Computational and Applied Mathematics 198 (2): 493–504.
Kennedy, Marc C., and Anthony O’Hagan. 2001. β€œBayesian Calibration of Computer Models.” Journal of the Royal Statistical Society: Series B (Statistical Methodology) 63 (3): 425–64.
Knapik, B. T., A. W. van der Vaart, and J. H. van Zanten. 2011. β€œBayesian Inverse Problems with Gaussian Priors.” The Annals of Statistics 39 (5).
KrΓ€mer, Nicholas, Nathanael Bosch, Jonathan Schmidt, and Philipp Hennig. 2021. β€œProbabilistic ODE Solutions in Millions of Dimensions.” arXiv.
Lasanen, Sari. 2002. β€œDiscretizations of Generalized Random Variables with Applications to Inverse Problems.”
β€”β€”β€”. 2012a. β€œNon-Gaussian Statistical Inverse Problems. Part I: Posterior Distributions.” Inverse Problems and Imaging 6 (2): 215.
β€”β€”β€”. 2012b. β€œNon-Gaussian Statistical Inverse Problems. Part II: Posterior Convergence for Approximated Unknowns.” Inverse Problems & Imaging 6 (2): 267.
Lasanen, S, and L Roininen. 2005. β€œStatistical Inversion with Green’s Priors.” In Proceedings of the 5th International Conference on Inverse Problems in Engineering: Theory and Practice, Cambridge, UK, 11.
Lassas, Matti, Eero Saksman, and Samuli Siltanen. 2009. β€œDiscretization-Invariant Bayesian Inversion and Besov Space Priors.” Inverse Problems and Imaging 3 (1): 87–122.
Lassas, Matti, and Samuli Siltanen. 2004. β€œCan One Use Total Variation Prior for Edge-Preserving Bayesian Inversion?” Inverse Problems 20 (5): 1537–63.
Lehtinen, M. S., L. Paivarinta, and E. Somersalo. 1989. β€œLinear Inverse Problems for Generalised Random Variables.” Inverse Problems 5 (4): 599–612.
Liu, Xiao, Kyongmin Yeo, and Siyuan Lu. 2020. β€œStatistical Modeling for Spatio-Temporal Data From Stochastic Convection-Diffusion Processes.” Journal of the American Statistical Association 0 (0): 1–18.
Lu, Lu, Pengzhan Jin, and George Em Karniadakis. 2020. β€œDeepONet: Learning Nonlinear Operators for Identifying Differential Equations Based on the Universal Approximation Theorem of Operators.” arXiv:1910.03193 [Cs, Stat], April.
Lu, Lu, Xuhui Meng, Zhiping Mao, and George Em Karniadakis. 2021. β€œDeepXDE: A Deep Learning Library for Solving Differential Equations.” SIAM Review 63 (1): 208–28.
Luschgy, H. 1996. β€œLinear Estimators and Radonifying Operators.” Theory of Probability & Its Applications 40 (1): 167–75.
Magnani, Emilia, Nicholas KrΓ€mer, Runa Eschenhagen, Lorenzo Rosasco, and Philipp Hennig. 2022. β€œApproximate Bayesian Neural Operators: Uncertainty Quantification for Parametric PDEs.” arXiv.
Mandelbaum, Avi. 1984. β€œLinear Estimators and Measurable Linear Transformations on a Hilbert Space.” Zeitschrift FΓΌr Wahrscheinlichkeitstheorie Und Verwandte Gebiete 65 (3): 385–97.
Margossian, Charles C., Aki Vehtari, Daniel Simpson, and Raj Agrawal. 2020. β€œHamiltonian Monte Carlo Using an Adjoint-Differentiated Laplace Approximation: Bayesian Inference for Latent Gaussian Models and Beyond.” arXiv:2004.12550 [Stat], October.
Mosegaard, Klaus, and Albert Tarantola. 1995. β€œMonte Carlo Sampling of Solutions to Inverse Problems.” Journal of Geophysical Research: Solid Earth 100 (B7): 12431–47.
β€”β€”β€”. 2002. β€œProbabilistic Approach to Inverse Problems.” In International Geophysics, 81:237–65. Elsevier.
NiinimΓ€ki, K., S. Siltanen, and V. Kolehmainen. 2007. β€œBayesian multiresolution method for local tomography in dental x-ray imaging.” Physics in Medicine and Biology 52 (22): 6663–78.
O’Hagan, A. 2006. β€œBayesian Analysis of Computer Code Outputs: A Tutorial.” Reliability Engineering & System Safety, The Fourth International Conference on Sensitivity Analysis of Model Output (SAMO 2004), 91 (10): 1290–300.
Perdikaris, Paris, and George Em Karniadakis. 2016. β€œModel inversion via multi-fidelity Bayesian optimization: a new paradigm for parameter estimation in haemodynamics, and beyond.” Journal of the Royal Society, Interface 13 (118): 20151107.
Petra, Noemi, James Martin, Georg Stadler, and Omar Ghattas. 2014. β€œA Computational Framework for Infinite-Dimensional Bayesian Inverse Problems, Part II: Stochastic Newton MCMC with Application to Ice Sheet Flow Inverse Problems.” SIAM Journal on Scientific Computing 36 (4): A1525–55.
Piiroinen, Petteri. 2005. β€œStatistical Measurements, Experiments and Applications.” Doctoral Thesis, Helsinki: Suomalainen Tiedeakatemia.
Pikkarainen, Hanna Katriina. 2006. β€œState Estimation Approach to Nonstationary Inverse Problems: Discretization Error and Filtering Problem.” Inverse Problems 22 (1): 365–79.
Pinski, F. J., G. Simpson, A. M. Stuart, and H. Weber. 2015. β€œKullback-Leibler Approximation for Probability Measures on Infinite Dimensional Spaces.” SIAM Journal on Mathematical Analysis 47 (6): 4091–4122.
Plumlee, Matthew. 2017. β€œBayesian Calibration of Inexact Computer Models.” Journal of the American Statistical Association 112 (519): 1274–85.
Preston, Leiph, and Christian Poppeliers. 2021. β€œLDRD #218329: Uncertainty Quantification of Geophysical Inversion Using Stochastic Partial Differential Equations.” SAND2021-10885. Sandia National Lab. (SNL-NM), Albuquerque, NM (United States).
Raissi, Maziar, Paris Perdikaris, and George Em Karniadakis. 2017a. β€œPhysics Informed Deep Learning (Part I): Data-Driven Solutions of Nonlinear Partial Differential Equations,” November.
β€”β€”β€”. 2017b. β€œPhysics Informed Deep Learning (Part II): Data-Driven Discovery of Nonlinear Partial Differential Equations,” November.
Raissi, Maziar, P. Perdikaris, and George Em Karniadakis. 2019. β€œPhysics-Informed Neural Networks: A Deep Learning Framework for Solving Forward and Inverse Problems Involving Nonlinear Partial Differential Equations.” Journal of Computational Physics 378 (February): 686–707.
Roininen, Lassi, Janne M. J. Huttunen, and Sari Lasanen. 2014. β€œWhittle-MatΓ©rn Priors for Bayesian Statistical Inversion with Applications in Electrical Impedance Tomography.” Inverse Problems & Imaging 8 (2): 561.
Roosta-Khorasani, Farbod, Kees van den Doel, and Uri Ascher. 2014. β€œData Completion and Stochastic Algorithms for PDE Inversion Problems with Many Measurements.” arXiv.
Sambridge, Malcolm, and Klaus Mosegaard. 2002. β€œMonte Carlo Methods in Geophysical Inverse Problems.” Reviews of Geophysics 40 (3): 3-1-3-29.
SΓ€rkkΓ€, Simo, A. Solin, and J. Hartikainen. 2013. β€œSpatiotemporal Learning via Infinite-Dimensional Bayesian Filtering and Smoothing: A Look at Gaussian Process Regression Through Kalman Filtering.” IEEE Signal Processing Magazine 30 (4): 51–61.
Schillings, Claudia, and Andrew M. Stuart. 2017. β€œAnalysis of the Ensemble Kalman Filter for Inverse Problems.” SIAM Journal on Numerical Analysis 55 (3): 1264–90.
Sigrist, Fabio Roman Albert. 2013. β€œPhysics Based Dynamic Modeling of Space-Time Data.” ETH Zurich.
Sigrist, Fabio, Hans R. KΓΌnsch, and Werner A. Stahel. 2015a. β€œSpate: An R Package for Spatio-Temporal Modeling with a Stochastic Advection-Diffusion Process.” Journal of Statistical Software 63 (14).
β€”β€”β€”. 2015b. β€œStochastic Partial Differential Equation Based Modelling of Large Space-Time Data Sets.” Journal of the Royal Statistical Society: Series B (Statistical Methodology) 77 (1): 3–33.
Song, Yang, Liyue Shen, Lei Xing, and Stefano Ermon. 2022. β€œSolving Inverse Problems in Medical Imaging with Score-Based Generative Models.” In. arXiv.
Song, Yang, Jascha Sohl-Dickstein, Diederik P. Kingma, Abhishek Kumar, Stefano Ermon, and Ben Poole. 2022. β€œScore-Based Generative Modeling Through Stochastic Differential Equations.” In.
Stuart, A. M. 2010. β€œInverse Problems: A Bayesian Perspective.” Acta Numerica 19: 451–559.
Stuart, Andrew M., and Aretha L. Teckentrup. 2016. β€œPosterior Consistency for Gaussian Process Approximations of Bayesian Posterior Distributions.” arXiv:1603.02004 [Math], December.
Sun, Shengyang, Guodong Zhang, Jiaxin Shi, and Roger Grosse. 2019. β€œFunctional Variational Bayesian Neural Networks.” In.
Tait, Daniel J., and Theodoros Damoulas. 2020. β€œVariational Autoencoding of PDE Inverse Problems.” arXiv:2006.15641 [Cs, Stat], June.
Tarantola, Albert. 2005. Inverse Problem Theory and Methods for Model Parameter Estimation. SIAM.
β€”β€”β€”. n.d. Mapping Of Probabilities.
Teckentrup, Aretha L. 2020. β€œConvergence of Gaussian Process Regression with Estimated Hyper-Parameters and Applications in Bayesian Inverse Problems.” arXiv:1909.00232 [Cs, Math, Stat], July.
Vogel, C. R. 1984. β€œStochastic Inversion of Linear First Kind Integral Equations. II. Discrete Theory and Convergence Results.” Journal of Integral Equations 7 (1): 73–92.
Welter, David E., Jeremy T. White, Randall J. Hunt, and John E. Doherty. 2015. β€œApproaches in Highly Parameterized Inversionβ€”PEST++ Version 3, a Parameter ESTimation and Uncertainty Analysis Software Suite Optimized for Large Environmental Models.” USGS Numbered Series 7-C12. Techniques and Methods. Reston, VA: U.S. Geological Survey.
White, Jeremy T., Michael N. Fienen, and John E. Doherty. 2016a. β€œpyEMU: A Python Framework for Environmental Model Uncertainty Analysis Version .01.” U.S. Geological Survey.
β€”β€”β€”. 2016b. β€œA Python Framework for Environmental Model Uncertainty Analysis.” Environmental Modelling & Software 85 (November): 217–28.
Xu, Kailai, and Eric Darve. 2019. β€œAdversarial Numerical Analysis for Inverse Problems.” arXiv.
β€”β€”β€”. 2020. β€œADCME: Learning Spatially-Varying Physical Fields Using Deep Neural Networks.” In arXiv:2011.11955 [Cs, Math].
Yang, Liu, Xuhui Meng, and George Em Karniadakis. 2021. β€œB-PINNs: Bayesian Physics-Informed Neural Networks for Forward and Inverse PDE Problems with Noisy Data.” Journal of Computational Physics 425 (January): 109913.
Yang, Liu, Dongkun Zhang, and George Em Karniadakis. 2020. β€œPhysics-Informed Generative Adversarial Networks for Stochastic Differential Equations.” SIAM Journal on Scientific Computing 42 (1): A292–317.
Zammit-Mangion, Andrew, Michael Bertolacci, Jenny Fisher, Ann Stavert, Matthew L. Rigby, Yi Cao, and Noel Cressie. 2021. β€œWOMBAT v1.0: A fully Bayesian global flux-inversion framework.” Geoscientific Model Development Discussions, July, 1–51.
Zhang, Dongkun, Ling Guo, and George Em Karniadakis. 2020. β€œLearning in Modal Space: Solving Time-Dependent Stochastic PDEs Using Physics-Informed Neural Networks.” SIAM Journal on Scientific Computing 42 (2): A639–65.
Zhang, Dongkun, Lu Lu, Ling Guo, and George Em Karniadakis. 2019. β€œQuantifying Total Uncertainty in Physics-Informed Neural Networks for Solving Forward and Inverse Stochastic Problems.” Journal of Computational Physics 397 (November): 108850.
Zhang, Xin, and Andrew Curtis. 2021. β€œBayesian Geophysical Inversion Using Invertible Neural Networks.” Journal of Geophysical Research: Solid Earth 126 (7): e2021JB022320.
Zhang, Zhongqiang, and George Em Karniadakis. 2017. Numerical Methods for Stochastic Partial Differential Equations with White Noise. Vol. 196. Applied Mathematical Sciences. Cham: Springer International Publishing.
