t-processes
Stochastic processes with Student-\(t\) marginals. Much as Student-\(t\) distributions generalise Gaussian distributions, \(t\)-processes generalise Gaussian processes.

t-process regression

There are a couple of classic places where \(t\)-processes arise in ML, e.g. in Bayesian neural networks (Neal 1996) and in the GP literature (Rasmussen and Williams 2006, sec. 9.9). Recently there has been an uptick in actual applications of these processes to regression (Chen, Wang, and Gorban 2020; Shah, Wilson, and Ghahramani 2014; Tang et al. 2017; Tracey and Wolpert 2018). See Wilson and Ghahramani (2011) for a Generalised Wishart Process construction that may be helpful here; that prior is available in GPyTorch. Recent papers (Shah, Wilson, and Ghahramani 2014; Tracey and Wolpert 2018) make the upgrade from Gaussian process regression look fairly straightforward.
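To make "fairly straightforward" concrete, here is a minimal numpy sketch of the posterior predictive in the parameterisation of Shah, Wilson, and Ghahramani (2014): the predictive mean coincides with the GP mean, the predictive covariance is the GP covariance rescaled by a data-dependent factor, and the degrees of freedom grow with the number of observations. The RBF kernel, the `nu` and `noise` defaults, and the folding of observation noise into the kernel diagonal are my illustrative assumptions, not anything taken from those papers' code.

```python
import numpy as np
from scipy.spatial.distance import cdist


def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel; any valid GP kernel works here."""
    return variance * np.exp(-0.5 * cdist(A, B, "sqeuclidean") / lengthscale**2)


def tp_predict(X, y, X_star, nu=5.0, noise=1e-2, **kern):
    """Student-t process predictive (Shah et al. 2014; requires nu > 2).

    Returns the predictive mean, covariance, and degrees of freedom
    of the multivariate Student-t posterior at X_star."""
    n = len(y)
    K = rbf_kernel(X, X, **kern) + noise * np.eye(n)  # noise lives in the kernel
    Ks = rbf_kernel(X, X_star, **kern)
    Kss = rbf_kernel(X_star, X_star, **kern)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))  # K^{-1} y
    V = np.linalg.solve(L, Ks)                           # so V.T V = Ks.T K^{-1} Ks
    mean = Ks.T @ alpha                                  # identical to the GP mean
    beta = y @ alpha                                     # y.T K^{-1} y
    scale = (nu + beta - 2.0) / (nu + n - 2.0)           # data-dependent rescaling
    cov = scale * (Kss - V.T @ V)
    return mean, cov, nu + n


# Toy usage with heavy-tailed observation noise.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(20, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_t(df=3, size=20)
X_star = np.linspace(-3, 3, 100)[:, None]
mu, Sigma, df = tp_predict(X, y, X_star, nu=5.0, lengthscale=1.0, variance=1.0)
```

Note that as \(\nu \to \infty\) the rescaling factor tends to 1 and the usual GP predictive is recovered, which is the sense in which this is an upgrade rather than a replacement.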

I am interested in seeing if these can be pressed into service as a model for mis-specification in Gaussian process regression.

Some papers discuss this in terms of inference using an inverse Wishart prior on the covariance.
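In the single-output case the inverse Wishart collapses to an inverse gamma and the construction is the familiar scale mixture. A sketch of the standard identity, in my notation rather than any particular paper's:

\[
\sigma^2 \sim \operatorname{InvGamma}\left(\tfrac{\nu}{2}, \tfrac{\nu}{2}\right),
\qquad
f \mid \sigma^2 \sim \mathcal{GP}(\mu, \sigma^2 k)
\quad\Longrightarrow\quad
f \sim \mathcal{TP}(\nu, \mu, k),
\]

since integrating \(\sigma^2\) out of any finite-dimensional marginal turns the Gaussian into a multivariate Student-\(t\) with \(\nu\) degrees of freedom. Shah, Wilson, and Ghahramani (2014) work with the matrix-variate version, placing an inverse Wishart prior directly on the covariance.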

Markov t-process

A process with Student-\(t\)-distributed increments is in fact a Lévy process, which follows from the fact that the Student-\(t\) distribution is infinitely divisible (Grosswald 1976; Ismail 1977). As far as I can see, Grigelionis (2013) is the definitive collation of results building on that observation.
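Simulation on an integer time grid is then a one-liner, since increments over unit spacings are exactly Student-\(t\); over fractional spacings the increment law is a different (non-\(t\)) member of the same Lévy family, so this sketch deliberately stays on the grid. The `df` and `scale` choices are illustrative.

```python
import numpy as np
from scipy import stats


def student_t_levy_skeleton(n_steps, df=3.0, scale=1.0, seed=0):
    """Integer-grid skeleton of the Levy process with Student-t
    unit-time increments: a cumulative sum of i.i.d. t draws."""
    rng = np.random.default_rng(seed)
    increments = stats.t.rvs(df, scale=scale, size=n_steps, random_state=rng)
    return np.concatenate([[0.0], np.cumsum(increments)])


path = student_t_levy_skeleton(1000, df=2.5)  # heavy tails: occasional large jumps
```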

References

Chen, Zexun, Bo Wang, and Alexander N. Gorban. 2020. “Multivariate Gaussian and Student-t Process Regression for Multi-Output Prediction.” Neural Computing and Applications 32 (8): 3005–28. https://doi.org/10.1007/s00521-019-04687-8.
Grigelionis, Bronius. 2013. Student’s t-Distribution and Related Stochastic Processes. SpringerBriefs in Statistics. Berlin, Heidelberg: Springer Berlin Heidelberg. https://doi.org/10.1007/978-3-642-31146-8.
Grosswald, E. 1976. “The Student t-Distribution of Any Degree of Freedom Is Infinitely Divisible.” Zeitschrift Für Wahrscheinlichkeitstheorie Und Verwandte Gebiete 36 (2): 103–9. https://doi.org/10.1007/BF00533993.
Ismail, Mourad E. H. 1977. “Bessel Functions and the Infinite Divisibility of the Student \(t\)-Distribution.” The Annals of Probability 5 (4): 582–85. https://doi.org/10.1214/aop/1176995766.
Neal, Radford M. 1996. “Bayesian Learning for Neural Networks.” Secaucus, NJ, USA: Springer-Verlag New York, Inc. http://www.csri.utoronto.ca/~radford/ftp/thesis.pdf.
Rasmussen, Carl Edward, and Christopher K. I. Williams. 2006. Gaussian Processes for Machine Learning. Adaptive Computation and Machine Learning. Cambridge, Mass: MIT Press. http://www.gaussianprocess.org/gpml/.
Shah, Amar, Andrew Wilson, and Zoubin Ghahramani. 2014. “Student-t Processes as Alternatives to Gaussian Processes.” In Artificial Intelligence and Statistics, 877–85. PMLR. http://proceedings.mlr.press/v33/shah14.html.
Tang, Qingtao, Li Niu, Yisen Wang, Tao Dai, Wangpeng An, Jianfei Cai, and Shu-Tao Xia. 2017. “Student-t Process Regression with Student-t Likelihood.” In Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence, 2822–28. https://www.ijcai.org/proceedings/2017/393.
Tracey, Brendan D., and David H. Wolpert. 2018. “Upgrading from Gaussian Processes to Student’s-T Processes.” 2018 AIAA Non-Deterministic Approaches Conference, January. https://doi.org/10.2514/6.2018-1659.
Wilson, Andrew Gordon, and Zoubin Ghahramani. 2011. “Generalised Wishart Processes.” In Proceedings of the Twenty-Seventh Conference on Uncertainty in Artificial Intelligence, 736–44. UAI’11. Arlington, Virginia, United States: AUAI Press. http://dl.acm.org/citation.cfm?id=3020548.3020633.
