Non-Gaussian Bayesian functional regression

Regression using non-Gaussian random fields. Generalised Gaussian process regression.

Is there ever an actual need for this? Or can we just use a mostly-Gaussian process with some non-Gaussian marginal distribution and pretend, via GP quantile regression, a variational GP approximation, or a non-Gaussian likelihood over Gaussian latents? Presumably we might bother with this if we suspect that moments higher than the second are important, or that there is some actual stochastic process known to match our phenomenon; but oh my, it can get complicated.
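As a minimal sketch of the "non-Gaussian likelihood over Gaussian latents" option: place a GP prior on latent function values and a heavy-tailed Student-t observation model on top, then find the latent MAP (the mode a Laplace approximation would expand around). All the specifics here (kernel, lengthscale, degrees of freedom, noise scale) are arbitrary illustrative choices, not anyone's recommended settings.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy data: smooth function plus heavy-tailed Student-t noise (occasional outliers).
X = np.linspace(0.0, 1.0, 40)
y = np.sin(2 * np.pi * X) + 0.1 * rng.standard_t(df=2, size=X.size)

# Squared-exponential GP prior over the latent values f = f(X).
def se_kernel(a, b, ell=0.2, sf=1.0):
    return sf**2 * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

K = se_kernel(X, X) + 1e-6 * np.eye(X.size)  # jitter for numerical stability
L = np.linalg.cholesky(K)

# Negative log posterior (up to constants): Student-t likelihood + GP prior.
nu, sigma = 2.0, 0.1
def neg_log_post(f):
    r = (y - f) / sigma
    nll = 0.5 * (nu + 1.0) * np.sum(np.log1p(r**2 / nu))   # Student-t terms
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, f))    # K^{-1} f via Cholesky
    return nll + 0.5 * f @ alpha

# Latent MAP estimate; a full Laplace approximation would add a Gaussian
# around this mode with covariance from the Hessian.
f_map = minimize(neg_log_post, np.zeros(X.size), method="L-BFGS-B").x
```

Because the Student-t likelihood downweights outliers, the MAP latent tracks the smooth signal rather than chasing the heavy-tailed noise; swapping in a Gaussian likelihood recovers ordinary GP regression.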

TODO: worked example, maybe using sparse stochastic process priors, neural process regression (Singh et al. 2019), or ML for PDEs.
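In the meantime, here is a tiny illustration of the sparse-stochastic-process idea in the sense of Unser and Tafti: drive a discrete integrator with non-Gaussian (here Laplace) white innovations rather than Gaussian ones, which preserves heavy tails in the increments. The parameter choices are arbitrary assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# White innovations, both scaled to unit variance:
w_gauss = rng.normal(0.0, 1.0, n)
w_laplace = rng.laplace(0.0, 1.0 / np.sqrt(2.0), n)  # Laplace var = 2 * scale^2

# Inverting a first-order whitening operator = cumulative sum (discrete integration):
# Gaussian innovations give a Brownian-motion-like path; Laplace innovations give a
# "sparse" path dominated by occasional large jumps.
x_gauss = np.cumsum(w_gauss)
x_laplace = np.cumsum(w_laplace)

# The increments keep the innovations' excess kurtosis: ~0 for Gaussian, ~3 for Laplace.
def excess_kurtosis(w):
    w = w - w.mean()
    return np.mean(w**4) / np.mean(w**2) ** 2 - 3.0
```

A GP model fitted to `x_laplace` would match its second-order statistics but miss exactly this jump behaviour, which is the higher-moment structure the paragraph above worries about.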


Bostan, E., U. S. Kamilov, M. Nilchian, and M. Unser. 2013. “Sparse Stochastic Processes and Discretization of Linear Inverse Problems.” IEEE Transactions on Image Processing 22 (7): 2699–2710.
Louizos, Christos, Xiahan Shi, Klamer Schutte, and Max Welling. 2019. “The Functional Neural Process.” arXiv:1906.08324 [cs, Stat], June.
Singh, Gautam, Jaesik Yoon, Youngsung Son, and Sungjin Ahn. 2019. “Sequential Neural Processes.” arXiv:1906.10264 [cs, Stat], June.
Unser, M. 2015. “Sampling and (sparse) Stochastic Processes: A Tale of Splines and Innovation.” In 2015 International Conference on Sampling Theory and Applications (SampTA), 221–25.
Unser, M., P. D. Tafti, A. Amini, and H. Kirshner. 2014. “A Unified Formulation of Gaussian vs Sparse Stochastic Processes - Part II: Discrete-Domain Theory.” IEEE Transactions on Information Theory 60 (5): 3036–51.
Unser, M., P. D. Tafti, and Q. Sun. 2014. “A Unified Formulation of Gaussian vs Sparse Stochastic Processes - Part I: Continuous-Domain Theory.” IEEE Transactions on Information Theory 60 (3): 1945–62.
Unser, Michael A., and Pouya Tafti. 2014. An Introduction to Sparse Stochastic Processes. New York: Cambridge University Press.
