Regression using non-Gaussian random fields.
Generalised Gaussian process regression.
Is there ever an actual need for this?
Or can we usually get away with a mostly-Gaussian process dressed up with non-Gaussian marginals, via GP quantile regression, some variational GP approximation, or a non-Gaussian likelihood over Gaussian latents?
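The "non-Gaussian marginals over Gaussian latents" trick can be sketched concretely as a Gaussian-copula construction: sample latent GP paths, map them to uniforms through the Gaussian CDF, then through the inverse CDF of whatever marginal we actually want. This is a minimal illustrative sketch, not anyone's published method; the function names, the RBF kernel, and the Student-t marginal are all my choices.

```python
import numpy as np
from scipy import stats

def rbf_kernel(x, lengthscale=0.5):
    # Squared-exponential covariance on a 1-D grid (illustrative choice).
    d = x[:, None] - x[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def sample_transformed_gp(x, marginal, n_draws=1, jitter=1e-6, rng=None):
    """Draw latent GP paths, then push each point through the Gaussian CDF
    and the target marginal's inverse CDF (a Gaussian copula)."""
    rng = np.random.default_rng(rng)
    K = rbf_kernel(x) + jitter * np.eye(len(x))  # jitter for numerical PSD-ness
    L = np.linalg.cholesky(K)
    z = L @ rng.standard_normal((len(x), n_draws))  # correlated Gaussian paths
    u = stats.norm.cdf(z)                           # uniform marginals
    return marginal.ppf(u)                          # desired non-Gaussian marginals

x = np.linspace(0.0, 1.0, 200)
# e.g. heavy-tailed Student-t marginals riding on a Gaussian copula
paths = sample_transformed_gp(x, stats.t(df=3), n_draws=5, rng=0)
```

Note what this does *not* buy us: the dependence structure is still entirely Gaussian; only the pointwise marginals change, which is exactly the limitation the present entry is worrying about.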
Presumably we would bother with a genuinely non-Gaussian process if we suspect that moments higher than the second matter, or that some actual stochastic process we can identify matches our phenomenon. But oh my, it can get complicated.
TODO: example, maybe using sparse stochastic process priors.
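The sparse-stochastic-process idea of Unser et al. (2014) is, at its discrete-domain core, an innovation model: white noise pushed through a linear operator, where Gaussian innovations give classical smooth paths and heavy-tailed innovations give sparse, jumpy ones. A toy sketch under my own simplifying assumptions (a first-order integrator as the operator, Student-t innovations standing in for an infinitely divisible heavy-tailed law):

```python
import numpy as np

def innovation_model_path(innovations):
    """Discrete innovation model: i.i.d. increments pushed through a
    first-order integrator (cumulative sum). Gaussian increments give a
    Brownian-motion-like path; heavy-tailed increments give sparse jumps."""
    return np.cumsum(innovations)

rng = np.random.default_rng(0)
n = 1000
gaussian_path = innovation_model_path(rng.standard_normal(n))
sparse_path = innovation_model_path(rng.standard_t(df=1.2, size=n))
# The heavy-tailed path is dominated by a few large increments,
# which is the "sparse" behaviour the Gaussian model cannot produce.
```

The point of the comparison: both paths come from the same linear operator, so the qualitative difference is carried entirely by the innovation law, which is exactly the Gaussian-vs-sparse dichotomy of the Unser et al. papers below.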
Neural process regression: Singh et al. (2019), Louizos et al. (2019).
Bostan, E., U. S. Kamilov, M. Nilchian, and M. Unser. 2013. “Sparse Stochastic Processes and Discretization of Linear Inverse Problems.” IEEE Transactions on Image Processing 22 (7): 2699–2710. https://doi.org/10.1109/TIP.2013.2255305

Louizos, Christos, Xiahan Shi, Klamer Schutte, and Max Welling. 2019. “The Functional Neural Process.” June 19, 2019. http://arxiv.org/abs/1906.08324

Singh, Gautam, Jaesik Yoon, Youngsung Son, and Sungjin Ahn. 2019. “Sequential Neural Processes.” June 24, 2019. http://arxiv.org/abs/1906.10264

Unser, M. 2015. “Sampling and (Sparse) Stochastic Processes: A Tale of Splines and Innovation.” In 2015 International Conference on Sampling Theory and Applications (SampTA), 221–25. https://doi.org/10.1109/SAMPTA.2015.7148884

Unser, M., P. D. Tafti, A. Amini, and H. Kirshner. 2014. “A Unified Formulation of Gaussian vs Sparse Stochastic Processes, Part II: Discrete-Domain Theory.” IEEE Transactions on Information Theory 60 (5): 3036–51. https://doi.org/10.1109/TIT.2014.2311903

Unser, M., P. D. Tafti, and Q. Sun. 2014. “A Unified Formulation of Gaussian vs Sparse Stochastic Processes, Part I: Continuous-Domain Theory.” IEEE Transactions on Information Theory 60 (3): 1945–62. https://doi.org/10.1109/TIT.2014.2298453

Unser, Michael A., and Pouya Tafti. 2014. An Introduction to Sparse Stochastic Processes. New York: Cambridge University Press.