Gaussian processes on lattices

October 30, 2019 — February 28, 2022

Figure 1

\[ \renewcommand{\var}{\operatorname{Var}} \renewcommand{\dd}{\mathrm{d}} \renewcommand{\pd}{\partial} \renewcommand{\vv}[1]{\boldsymbol{#1}} \renewcommand{\mm}[1]{\mathrm{#1}} \renewcommand{\mmm}[1]{\mathrm{#1}} \renewcommand{\cc}[1]{\mathcal{#1}} \renewcommand{\oo}[1]{\operatorname{#1}} \renewcommand{\gvn}{\mid} \renewcommand{\II}[1]{\mathbb{I}\{#1\}} \renewcommand{\inner}[2]{\langle #1,#2\rangle} \renewcommand{\Inner}[2]{\left\langle #1,#2\right\rangle} \renewcommand{\norm}[1]{\| #1\|} \renewcommand{\Norm}[1]{\left\| #1\right\|} \renewcommand{\argmax}{\operatorname{arg max}} \renewcommand{\argmin}{\operatorname{arg min}} \renewcommand{\omp}{\mathop{\mathrm{OMP}}} \]

Gaussian processes with a stationary kernel are faster to work with if the inputs lie on a regular grid of points. I have not used this trick myself, but I understand it involves various simplifications arising from the structure of the Gram matrix, which under lexical ordering of the input points becomes a Kronecker product of Toeplitz matrices — a lot of words for a fairly simple idea. The keyword for this in the ML literature is Kronecker inference (e.g. Saatçi 2012; Flaxman et al. 2015; A. G. Wilson and Nickisch 2015). Another keyword is circulant embedding, the embedding of a Toeplitz matrix in a larger circulant matrix, which enables one to leverage fast Fourier transforms to calculate various quantities of interest, and enjoys some other nice linear-algebra properties besides. That would place the origins of this method with Whittle likelihoods (P. Whittle 1953a; Peter Whittle 1952), if I am not mistaken.
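A minimal sketch of the circulant-embedding half of that claim (the kernel and grid here are my own toy illustration, not taken from any of the cited papers): on a regular 1-D grid a stationary kernel gives a Toeplitz Gram matrix, determined by its first column alone; that column can be extended to the first column of a circulant matrix, whose eigenvalues are the FFT of that column, so matrix–vector products cost \(O(n \log n)\) instead of \(O(n^2)\).

```python
import numpy as np

# Stationary (here RBF) kernel on a regular 1-D grid: the Gram matrix
# is Toeplitz, so it is fully determined by its first column.
def rbf(r, ell=0.5):
    return np.exp(-0.5 * (r / ell) ** 2)

n = 64
x = np.linspace(0.0, 1.0, n)
first_col = rbf(x - x[0])                  # k(x_i, x_0) for all i
K = rbf(np.abs(x[:, None] - x[None, :]))   # dense Toeplitz Gram matrix

# Circulant embedding: extend the first column to length 2n - 2 so the
# resulting circulant matrix contains K as its leading n-by-n block.
c = np.concatenate([first_col, first_col[-2:0:-1]])
eigs = np.fft.fft(c)  # eigenvalues of the embedding circulant matrix

def toeplitz_matvec(v):
    """K @ v via FFT: zero-pad v, multiply in Fourier space, truncate."""
    v_pad = np.concatenate([v, np.zeros(len(c) - len(v))])
    return np.real(np.fft.ifft(eigs * np.fft.fft(v_pad)))[: len(v)]

v = np.random.default_rng(0).standard_normal(n)
assert np.allclose(K @ v, toeplitz_matvec(v))
```

The matvec is exact regardless of whether the embedding is positive definite; positive definiteness of the circulant extension only matters when one wants to *sample* from it, as in the simulation literature below.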

Lattice GP methods complement, perhaps, the trick of filtering Gaussian processes, which can also exploit the structure of lattice inputs, although the setup differs between the two.

TBC.

1 Simulating lattice Gaussian fields with the desired covariance structure

See GP simulation on lattices.

2 In regression

Figure 2

OK, so what does this allow us to do with posterior inference in GPs? Quite a lot, apparently. The KISS-GP method (A. G. Wilson and Nickisch 2015) presumably leverages something similar, as do Saatçi (2012) and Flaxman et al. (2015).

Flaxman et al. (2015) handle the lattice-structure trick in a non-Gaussian likelihood setting using a Laplace approximation.
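To illustrate the Kronecker half of the trick (again a hedged sketch with a made-up separable kernel, not code from the cited papers): on a 2-D lattice with a product kernel, the full Gram matrix is the Kronecker product of the two 1-D Gram matrices, and the identity \((A \otimes B)\operatorname{vec}(V) = \operatorname{vec}(A V B^\top)\) (for row-major `vec`) gives matvecs without ever forming the \(n_x n_y \times n_x n_y\) matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_gram(x, ell=0.3):
    return np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / ell**2)

# Separable kernel on an nx-by-ny lattice: the Gram matrix is
# kron(Kx, Ky), which never needs to be formed explicitly.
nx, ny = 30, 40
Kx = rbf_gram(np.linspace(0, 1, nx))
Ky = rbf_gram(np.linspace(0, 1, ny))

V = rng.standard_normal((nx, ny))

# (Kx ⊗ Ky) vec(V) = vec(Kx V Kyᵀ) for row-major vec:
# O(nx·ny·(nx+ny)) work instead of O((nx·ny)²).
fast = (Kx @ V @ Ky.T).ravel()
dense = np.kron(Kx, Ky) @ V.ravel()
assert np.allclose(fast, dense)
```

The same reshaping trick extends to eigendecompositions and determinants, since eigenvalues of a Kronecker product are products of the factors' eigenvalues, which is what makes full Kronecker inference tractable.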

TBC

3 Incoming

4 References

Abrahamsen, Kvernelv, and Barker. 2018. Simulation Of Gaussian Random Fields Using The Fast Fourier Transform (FFT).” In.
Alexanderian. 2015. A Brief Note on the Karhunen-Loève Expansion.” arXiv:1509.07526 [Math].
Chan, Grace, and Wood. 1997. Algorithm AS 312: An Algorithm for Simulating Stationary Gaussian Random Fields.” Journal of the Royal Statistical Society: Series C (Applied Statistics).
Chan, G., and Wood. 1999. Simulation of Stationary Gaussian Vector Fields.” Statistics and Computing.
Charlier, Feydy, Glaunès, et al. 2021. Kernel Operations on the GPU, with Autodiff, Without Memory Overflows.” Journal of Machine Learning Research.
Choromanski, and Sindhwani. 2016. Recycling Randomness with Structure for Sublinear Time Kernel Expansions.” arXiv:1605.09049 [Cs, Stat].
Choudhuri, Ghosal, and Roy. 2004. Contiguity of the Whittle Measure for a Gaussian Time Series.” Biometrika.
Cotter, Roberts, Stuart, et al. 2013. MCMC Methods for Functions: Modifying Old Algorithms to Make Them Faster.” Statistical Science.
Davies, and Bryant. 2013. On Circulant Embedding for Gaussian Random Fields in R.” Journal of Statistical Software.
Dietrich, and Newsam. 1993. A Fast and Exact Method for Multidimensional Gaussian Stochastic Simulations.” Water Resources Research.
———. 1997. Fast and Exact Simulation of Stationary Gaussian Processes Through Circulant Embedding of the Covariance Matrix.” SIAM Journal on Scientific Computing.
Durrande, Adam, Bordeaux, et al. 2019. Banded Matrix Operators for Gaussian Markov Models in the Automatic Differentiation Era.” In Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics.
Ellis, and Lay. 1992. Factorization of Finite Rank Hankel and Toeplitz Matrices.” Linear Algebra and Its Applications.
Erhel, Oumouni, Pichot, et al. n.d. “Analysis of Continuous Spectral Method for Sampling Stationary Gaussian Random Fields.”
Flaxman, Wilson, Neill, et al. 2015. “Fast Kronecker Inference in Gaussian Processes with Non-Gaussian Likelihoods.” In.
Gilboa, Saatçi, and Cunningham. 2015. Scaling Multidimensional Inference for Structured Gaussian Processes.” IEEE Transactions on Pattern Analysis and Machine Intelligence.
Graham, Kuo, Nuyens, et al. 2017a. Analysis of Circulant Embedding Methods for Sampling Stationary Random Fields.” arXiv:1710.00751 [Math].
———, et al. 2017b. Circulant Embedding with QMC — Analysis for Elliptic PDE with Lognormal Coefficients.” arXiv:1710.09254 [Math].
Gray. 2006. Toeplitz and Circulant Matrices: A Review.
Guinness, and Fuentes. 2016. Circulant Embedding of Approximate Covariances for Inference From Gaussian Data on Large Lattices.” Journal of Computational and Graphical Statistics.
Heinig, and Rost. 2011. Fast Algorithms for Toeplitz and Hankel Matrices.” Linear Algebra and Its Applications.
Kroese, and Botev. 2013. Spatial Process Generation.” arXiv:1308.0399 [Stat].
Lang, and Potthoff. 2011. Fast Simulation of Gaussian Random Fields.” Monte Carlo Methods and Applications.
Loper, Blei, Cunningham, et al. 2021. Linear-Time Inference for Gaussian Processes on One Dimension.” arXiv:2003.05554 [Cs, Stat].
Nowak, and Litvinenko. 2013. Kriging and Spatial Design Accelerated by Orders of Magnitude: Combining Low-Rank Covariance Approximations with FFT-Techniques.” Mathematical Geosciences.
Pleiss, Gardner, Weinberger, et al. 2018. Constant-Time Predictive Distributions for Gaussian Processes.” In.
Powell. 2014. “Generating Realisations of Stationary Gaussian Random Fields by Circulant Embedding.” Matrix.
Rue, Havard. 2001. Fast Sampling of Gaussian Markov Random Fields.” Journal of the Royal Statistical Society. Series B (Statistical Methodology).
Rue, Håvard, and Held. 2005. Gaussian Markov Random Fields: Theory and Applications. Monographs on Statistics and Applied Probability 104.
Saatçi. 2012. Scalable inference for structured Gaussian process models.”
Saatçi, Turner, and Rasmussen. 2010. Gaussian Process Change Point Models.” In Proceedings of the 27th International Conference on International Conference on Machine Learning. ICML’10.
Sigrist, Künsch, and Stahel. 2015a. Spate : An R Package for Spatio-Temporal Modeling with a Stochastic Advection-Diffusion Process.” Application/pdf. Journal of Statistical Software.
———. 2015b. Stochastic Partial Differential Equation Based Modelling of Large Space-Time Data Sets.” Journal of the Royal Statistical Society: Series B (Statistical Methodology).
Stroud, Stein, and Lysen. 2017. Bayesian and Maximum Likelihood Estimation for Gaussian Processes on an Incomplete Lattice.” Journal of Computational and Graphical Statistics.
Teichmann, and van den Boogaart. 2016. Efficient Simulation of Stationary Multivariate Gaussian Random Fields with Given Cross-Covariance.” Applied Mathematics.
Ubaru, Chen, and Saad. 2017. Fast Estimation of \(tr(f(A))\) via Stochastic Lanczos Quadrature.” SIAM Journal on Matrix Analysis and Applications.
Whittle, Peter. 1952. Some Results in Time Series Analysis.” Scandinavian Actuarial Journal.
Whittle, P. 1952. Tests of Fit in Time Series.” Biometrika.
———. 1953a. The Analysis of Multiple Stationary Time Series.” Journal of the Royal Statistical Society: Series B (Methodological).
———. 1953b. Estimation and Information in Stationary Time Series.” Arkiv För Matematik.
Whittle, P. 1954. “On Stationary Processes in the Plane.” Biometrika.
Wilson, James T, Borovitskiy, Terenin, et al. 2021. Pathwise Conditioning of Gaussian Processes.” Journal of Machine Learning Research.
Wilson, Andrew Gordon, Dann, and Nickisch. 2015. Thoughts on Massively Scalable Gaussian Processes.” arXiv:1511.01870 [Cs, Stat].
Wilson, Andrew Gordon, and Nickisch. 2015. Kernel Interpolation for Scalable Structured Gaussian Processes (KISS-GP).” In Proceedings of the 32Nd International Conference on International Conference on Machine Learning - Volume 37. ICML’15.
Ye, and Lim. 2016. Every Matrix Is a Product of Toeplitz Matrices.” Foundations of Computational Mathematics.