# Kernel warping

A nonlinear way of transforming stationary kernels into non-stationary ones by transforming their inputs.

This is of interest both for composing kernels with known desirable properties via known transforms, and for learning (somewhat) arbitrary transforms that render a process stationary.

## Stationary reducible kernels

The main idea is to find a new feature space where stationarity or local stationarity can be achieved.

Genton and Perrin (2004) summarise: we say that a nonstationary kernel $$K(\mathbf{x}, \mathbf{z})$$ is stationary reducible if there exists a bijective deformation $$\Phi$$ such that $$K(\mathbf{x}, \mathbf{z})=K_{S}^{*}(\Phi(\mathbf{x})-\Phi(\mathbf{z})),$$ where $$K_{S}^{*}$$ is a stationary kernel.
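As a minimal sketch of this idea: take a stationary squared-exponential kernel and compose it with a hypothetical bijective warp (here a cubic, chosen only because it is strictly increasing). The warped kernel inherits positive definiteness from the stationary one, which we can check numerically on a Gram matrix.

```python
import numpy as np

def k_rbf(u, v, lengthscale=1.0):
    """Stationary squared-exponential kernel k_S(u - v)."""
    return np.exp(-np.sum((u - v) ** 2) / (2 * lengthscale ** 2))

def warp(x):
    """A bijective deformation Phi; a hypothetical cubic warp, strictly increasing on R."""
    return x ** 3 + x

def k_warped(x, z):
    """Non-stationary kernel K(x, z) = k_S(Phi(x) - Phi(z))."""
    return k_rbf(warp(x), warp(z))

# Positive definiteness is inherited from k_S: the Gram matrix stays PSD.
xs = np.linspace(-1.0, 1.0, 20)
K = np.array([[k_warped(a, b) for b in xs] for a in xs])
assert np.all(np.linalg.eigvalsh(K) > -1e-8)
```

Any strictly monotone (hence bijective) warp would do here; learning $$\Phi$$ from data is the harder problem the deformation literature addresses.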

## Classic deformations

### As a function of input

Invented, apparently, by Gibbs (1998) and generalised by Paciorek and Schervish (2003).

Let $$k_S$$ be some stationary kernel on $$\mathbb{R}^D.$$ Let $$\Sigma(\mathbf{x})$$ be a $$D \times D$$ matrix-valued function which is positive definite for all $$\mathbf{x},$$ and let $$\Sigma_{i} \triangleq \Sigma\left(\mathbf{x}_{i}\right).$$ Define the quadratic form $$Q_{i j}=\left(\mathbf{x}_{i}-\mathbf{x}_{j}\right)^{\top}\left(\left(\Sigma_{i}+\Sigma_{j}\right) / 2\right)^{-1}\left(\mathbf{x}_{i}-\mathbf{x}_{j}\right).$$ Then $$k_{\mathrm{NS}}\left(\mathbf{x}_{i}, \mathbf{x}_{j}\right)=2^{D / 2}\left|\Sigma_{i}\right|^{1 / 4}\left|\Sigma_{j}\right|^{1 / 4}\left|\Sigma_{i}+\Sigma_{j}\right|^{-1 / 2} k_{\mathrm{S}}\left(\sqrt{Q_{i j}}\right)$$ is a valid non-stationary covariance function.
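The construction above is concrete enough to sketch in code. Below is the $$D = 1$$ case, where $$\Sigma(x)$$ reduces to a scalar $$\ell(x)^2$$ with a hypothetical input-dependent lengthscale $$\ell(x);$$ the prefactor guarantees the resulting Gram matrix is PSD, which we verify numerically.

```python
import numpy as np

def ell(x):
    """Hypothetical input-dependent lengthscale ell(x) > 0; Sigma(x) = ell(x)^2 for D = 1."""
    return 0.5 + x ** 2

def k_s(r):
    """Stationary squared-exponential profile k_S(r) = exp(-r^2 / 2)."""
    return np.exp(-0.5 * r ** 2)

def k_ns(xi, xj):
    """Paciorek-Schervish non-stationary kernel, specialised to D = 1."""
    si, sj = ell(xi) ** 2, ell(xj) ** 2           # Sigma_i, Sigma_j (scalars)
    q = (xi - xj) ** 2 / ((si + sj) / 2)          # Q_ij
    prefactor = 2 ** 0.5 * si ** 0.25 * sj ** 0.25 * (si + sj) ** -0.5
    return prefactor * k_s(np.sqrt(q))

# The theorem promises a valid covariance: the Gram matrix should be PSD,
# and the prefactor normalises the diagonal so that k_ns(x, x) = 1.
xs = np.linspace(-2.0, 2.0, 30)
K = np.array([[k_ns(a, b) for b in xs] for a in xs])
assert np.all(np.linalg.eigvalsh(K) > -1e-8)
```

Note the trade-off: unlike a plain input warp, this construction pays for its input-dependent lengthscale with the determinant prefactor, which is exactly what keeps the kernel positive definite.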

Homework question: is this a product of convolutional Gaussian processes?

## References

Belkin, Mikhail, Siyuan Ma, and Soumik Mandal. 2018. “To Understand Deep Learning We Need to Understand Kernel Learning.” In International Conference on Machine Learning, 541–49. http://arxiv.org/abs/1802.01396.
Bohn, Bastian, Michael Griebel, and Christian Rieger. 2018. “A Representer Theorem for Deep Kernel Learning.” June 7, 2018. http://arxiv.org/abs/1709.10441.
Damian, Doris, Paul D. Sampson, and Peter Guttorp. 2001. “Bayesian Estimation of Semi-Parametric Non-Stationary Spatial Covariance Structures.” Environmetrics 12 (2): 161–78. https://doi.org/10.1002/1099-095X(200103)12:2<161::AID-ENV452>3.0.CO;2-G.
Feragen, Aasa, and Søren Hauberg. n.d. “Open Problem: Kernel Methods on Manifolds and Metric Spaces,” 4.
Genton, Marc G. 2001. “Classes of Kernels for Machine Learning: A Statistics Perspective.” Journal of Machine Learning Research 2 (December): 299–312. http://jmlr.org/papers/volume2/genton01a/genton01a.pdf.
Genton, Marc G., and Olivier Perrin. 2004. “On a Time Deformation Reducing Nonstationary Stochastic Processes to Local Stationarity.” Journal of Applied Probability 41 (1): 236–49. https://doi.org/10.1239/jap/1077134681.
Gibbs, M. N. 1998. “Bayesian Gaussian Processes for Regression and Classification.” Ph.D., University of Cambridge. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.599379.
Hinton, Geoffrey E, and Ruslan R Salakhutdinov. 2008. “Using Deep Belief Nets to Learn Covariance Kernels for Gaussian Processes.” In Advances in Neural Information Processing Systems 20, edited by J. C. Platt, D. Koller, Y. Singer, and S. T. Roweis, 1249–56. Curran Associates, Inc. http://papers.nips.cc/paper/3211-using-deep-belief-nets-to-learn-covariance-kernels-for-gaussian-processes.pdf.
Ikeda, Masahiro, Isao Ishikawa, and Yoshihiro Sawano. 2021. “Composition Operators on Reproducing Kernel Hilbert Spaces with Analytic Positive Definite Functions.” March 9, 2021. http://arxiv.org/abs/1911.11992.
Paciorek, Christopher J., and Mark J. Schervish. 2003. “Nonstationary Covariance Functions for Gaussian Process Regression.” In Proceedings of the 16th International Conference on Neural Information Processing Systems, 16:273–80. NIPS’03. Cambridge, MA, USA: MIT Press. https://papers.nips.cc/paper/2003/hash/326a8c055c0d04f5b06544665d8bb3ea-Abstract.html.
Perrin, Olivier, and Rachid Senoussi. 1999. “Reducing Non-Stationary Stochastic Processes to Stationarity by a Time Deformation.” Statistics & Probability Letters 43 (4): 393–97. https://doi.org/10.1016/S0167-7152(98)00278-8.
———. 2000. “Reducing Non-Stationary Random Fields to Stationarity and Isotropy Using a Space Deformation.” Statistics & Probability Letters 48 (1): 23–32. https://doi.org/10.1016/S0167-7152(99)00188-1.
Rasmussen, Carl Edward, and Christopher K. I. Williams. 2006. Gaussian Processes for Machine Learning. Adaptive Computation and Machine Learning. Cambridge, Mass: Max-Planck-Gesellschaft; MIT Press. http://www.gaussianprocess.org/gpml/.
Sampson, Paul D., and Peter Guttorp. 1992. “Nonparametric Estimation of Nonstationary Spatial Covariance Structure.” Journal of the American Statistical Association 87 (417): 108–19. https://doi.org/10.1080/01621459.1992.10475181.
Schmidt, Alexandra M., and Anthony O’Hagan. 2003. “Bayesian Inference for Non-Stationary Spatial Covariance Structure via Spatial Deformations.” Journal of the Royal Statistical Society: Series B (Statistical Methodology) 65 (3): 743–58. https://doi.org/10.1111/1467-9868.00413.
Shimotsu, Katsumi, and Peter C. B. Phillips. 2004. “Local Whittle Estimation in Nonstationary and Unit Root Cases.” The Annals of Statistics 32 (2): 656–92. https://doi.org/10.1214/009053604000000139.
Snoek, Jasper, Kevin Swersky, Rich Zemel, and Ryan Adams. 2014. “Input Warping for Bayesian Optimization of Non-Stationary Functions.” In Proceedings of the 31st International Conference on Machine Learning (ICML-14), 1674–82. http://www.jmlr.org/proceedings/papers/v32/snoek14.pdf.
Tompkins, Anthony, and Fabio Ramos. 2018. “Fourier Feature Approximations for Periodic Kernels in Time-Series Modelling.” Proceedings of the AAAI Conference on Artificial Intelligence 32 (1). https://ojs.aaai.org/index.php/AAAI/article/view/11696.
Wilson, Andrew Gordon, Zhiting Hu, Ruslan Salakhutdinov, and Eric P. Xing. 2016. “Deep Kernel Learning.” In Artificial Intelligence and Statistics, 370–78. PMLR. http://proceedings.mlr.press/v51/wilson16.html.
