Inverse problems where the model parameter lives in some function space. For me this usually means a spatiotemporal model, typically in the context of PDE solvers, particularly approximate ones.
Inverse problems arise naturally in tomography, compressed sensing, deconvolution, inverting PDEs and many other areas.
Suppose I have a PDE, possibly with some unknown parameters in the driving equation. All else being equal, I can do reasonably well at approximating its solution with the tools already mentioned. What if I wish to simultaneously infer some unknown inputs? Then we are treating it as an inverse problem. This is not quite the same as the predictive problem that many of the methods target. Still, we are free to use simulation-based inference or MCMC methods (sketched below) on top of any of the forward-operator-learning approaches. To train a model that solves the inverse problem directly, we might consider GANs or variational inference. At this point we are more or less required to use a probabilistic network, or we will miss essential uncertainty quantification.
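To make the MCMC option concrete, here is a minimal sketch on a toy problem: a 1-D heat equation with a single unknown diffusivity, inverted by random-walk Metropolis over a finite-difference forward solver. The solver, prior bounds, and noise level are all invented for illustration; this is not taken from any of the methods cited below.

```python
# Toy PDE inverse problem by MCMC: infer the diffusivity kappa in
# u_t = kappa * u_xx from noisy observations of the final state.
# All settings here are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(0)

def forward(kappa, u0, dx=0.05, dt=0.001, n_steps=200):
    """Explicit finite-difference solve of the heat equation."""
    u = u0.copy()
    for _ in range(n_steps):
        u[1:-1] += kappa * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])
    return u

# Synthetic data from a "true" kappa, observed with Gaussian noise.
x = np.linspace(0.0, 1.0, 21)
u0 = np.sin(np.pi * x)
sigma = 0.01
y = forward(0.5, u0) + sigma * rng.normal(size=x.size)

def log_post(kappa):
    # Flat prior on (0, 1.2]; the upper bound also keeps the explicit
    # scheme stable (kappa * dt / dx**2 <= 0.5).
    if not 0.0 < kappa <= 1.2:
        return -np.inf
    r = y - forward(kappa, u0)
    return -0.5 * np.sum(r**2) / sigma**2

# Random-walk Metropolis over the scalar parameter.
kappa, lp = 1.0, log_post(1.0)
samples = []
for _ in range(5000):
    prop = kappa + 0.05 * rng.normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        kappa, lp = prop, lp_prop
    samples.append(kappa)

print("posterior mean kappa:", np.mean(samples[1000:]))
```

The same pattern works with a learned forward operator standing in for `forward`, which is where the surrogate methods below come in.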
We might also be inclined to use approximate methods; we would be surprised to find anything like a clean closed-form solution for the posterior distribution of some parameter in a PDE. Why would there be one?
As for how we might proceed, Liu, Yeo, and Lu (2020) is one option, generalizing F. Sigrist, KΓΌnsch, and Stahel (2015b), though specifically for advection-diffusion equations. Generic methods include Bao et al. (2020); Jo et al. (2020); Lu, Mao, and Meng (2019); Raissi, Perdikaris, and Karniadakis (2019); Tait and Damoulas (2020); Xu and Darve (2020); Yang, Zhang, and Karniadakis (2020); Zhang, Guo, and Karniadakis (2020); Zhang et al. (2019).
Bayesian nonparametrics
Since this kind of problem naturally invites functional parameters, we are in the world of Bayesian nonparametrics, which uses slightly different notation than you usually see in Bayes textbooks. I suspect there is a useful role for various Bayesian nonparametric methods here, but the easiest of all is the Gaussian process, which I handle next:
Gaussian process parameters
Alexanderian (2021) states a "well-known" result: the solution of a Bayesian linear inverse problem with Gaussian prior and noise models is a Gaussian posterior \(\mu_{\text {post }}^{y}=\mathcal{N}\left(m_{\text {MAP }}, \mathcal{C}_{\text {post }}\right)\), where
\[
\mathcal{C}_{\text {post }}=\left(\mathcal{F}^{*} \boldsymbol{\Gamma}_{\text {noise }}^{-1} \mathcal{F}+\mathcal{C}_{\text {pr }}^{-1}\right)^{-1} \quad \text { and } \quad m_{\text {MAP }}=\mathcal{C}_{\text {post }}\left(\mathcal{F}^{*} \boldsymbol{\Gamma}_{\text {noise }}^{-1} \boldsymbol{y}+\mathcal{C}_{\text {pr }}^{-1} m_{\text {pr }}\right).
\]
Here \(\mathcal{F}\) is the linear forward operator with adjoint \(\mathcal{F}^{*}\), \(\mathcal{C}_{\text {pr }}\) and \(m_{\text {pr }}\) are the prior covariance and mean, and \(\boldsymbol{\Gamma}_{\text {noise }}\) is the observation-noise covariance.
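In finite dimensions the adjoint is just a matrix transpose, so we can sanity-check this formula directly. A minimal numpy sketch, with arbitrary placeholder dimensions, covariances, and forward map:

```python
# Finite-dimensional check of the Gaussian posterior formula above.
# F, C_pr, Gamma, and the dimensions are arbitrary placeholders.
import numpy as np

rng = np.random.default_rng(1)
d, n = 10, 6                        # parameter and observation dimensions
F = rng.normal(size=(n, d))         # discretised forward operator
C_pr = np.eye(d)                    # prior covariance
m_pr = np.zeros(d)                  # prior mean
Gamma = 0.1 * np.eye(n)             # noise covariance

m_true = rng.normal(size=d)
y = F @ m_true + rng.multivariate_normal(np.zeros(n), Gamma)

Gi = np.linalg.inv(Gamma)
C_post = np.linalg.inv(F.T @ Gi @ F + np.linalg.inv(C_pr))
m_map = C_post @ (F.T @ Gi @ y + np.linalg.inv(C_pr) @ m_pr)

print("posterior mean:", m_map)
print("posterior marginal sd:", np.sqrt(np.diag(C_post)))
```

Note the problem is deliberately underdetermined (\(n < d\)); the prior is what keeps the posterior well defined, which is the whole point in the function-space setting.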
Finite Element Models
Finite element models of PDEs (and possibly other representations?) can be expressed as locally-linear constraints and thus handled using Gaussian Belief Propagation (Y. El-Kurdi et al. 2016; Y. M. El-Kurdi 2014; Y. El-Kurdi et al. 2015). Note that in this setting there is nothing special about the inversion process; inference proceeds the same either way, as a variational message-passing algorithm.
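A minimal sketch of what that looks like, assuming scalar Gaussian belief propagation on a generic linear system \(K u = f\) of the kind a discretised FEM problem yields. The tridiagonal \(K\) below is a stand-in, not a real stiffness matrix, and the updates are the textbook GaBP ones rather than anything specific to the El-Kurdi papers:

```python
# Scalar Gaussian belief propagation for K u = f. Textbook updates;
# the system below is a toy stand-in for a FEM stiffness system.
import numpy as np

def gabp(K, f, n_sweeps=100):
    n = K.shape[0]
    P = np.zeros((n, n))    # message precisions, P[i, j] is i -> j
    mu = np.zeros((n, n))   # message means
    nbrs = [np.flatnonzero((K[i] != 0) & (np.arange(n) != i)) for i in range(n)]
    for _ in range(n_sweeps):
        for i in range(n):
            for j in nbrs[i]:
                others = [k for k in nbrs[i] if k != j]
                p = K[i, i] + sum(P[k, i] for k in others)
                m = (f[i] + sum(P[k, i] * mu[k, i] for k in others)) / p
                P[i, j] = -K[i, j] ** 2 / p
                mu[i, j] = p * m / K[i, j]  # equals -K[i,j] * m / P[i,j]
    # Combine all incoming messages into marginal means.
    u = np.empty(n)
    for i in range(n):
        p = K[i, i] + sum(P[k, i] for k in nbrs[i])
        u[i] = (f[i] + sum(P[k, i] * mu[k, i] for k in nbrs[i])) / p
    return u

# Toy diagonally dominant "stiffness" system.
K = np.diag(np.full(5, 4.0)) \
    + np.diag(np.full(4, -1.0), 1) + np.diag(np.full(4, -1.0), -1)
f = np.ones(5)
print(np.allclose(gabp(K, f), np.linalg.solve(K, f)))
```

Plain GaBP like this is only guaranteed to converge for walk-summable (e.g. diagonally dominant) systems; as I understand it, much of the work in the cited papers is about making the scheme behave on realistic FEM matrices and parallel hardware.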