# (Reproducing) kernel tricks

August 18, 2014 — July 21, 2023

**WARNING**: This is very old. If I were to write it now, I would write it differently, and specifically more pedagogically.

Kernel in the sense of the “kernel trick”. Not to be confused with smoothing-type convolution kernels, nor the dozens of related-but-slightly-different clashing definitions of *kernel*; those can have their own respective pages. Corollary: If you do not know what to name something, call it a kernel.

We are concerned with a particular flavour of kernel in Hilbert spaces, specifically *reproducing* or *Mercer* kernels (Mercer 1909). The associated function space is a *reproducing kernel Hilbert space*, hereafter an *RKHS*.

Kernel *tricks* comprise the application of Mercer kernels in machine learning. The “trick” is that many machine learning algorithms operate on inner products, or can be rewritten to do so. Such algorithms permit one to swap out the boring classic Euclidean inner product in favour of a fancy RKHS one. The classic machine-learning pitch for trying such a stunt is something like “upgrade your old boring linear algebra on finite- (usually low-) dimensional spaces to sexy algebra on potentially-infinite-dimensional feature spaces, which still has a low-dimensional representation.” Or, if you’d like, “apply certain statistical learning methods based on things with an obvious finite vector space representation (\(\mathbb{R}^n\)) to things without one (sentences, piano-rolls, \(\mathcal{C}^d_\ell\)).”
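To make the swap concrete, here is a minimal NumPy sketch (my own, with an illustrative quadratic kernel and toy dimensions, not anyone’s canonical implementation): ridge regression on an explicit feature map and kernel ridge regression with the matching Mercer kernel produce identical predictions, but the kernel version touches only inner products.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))   # 50 points in R^3
y = rng.normal(size=50)
lam = 0.1                      # ridge penalty

# Explicit feature map for the quadratic kernel k(x, z) = (x . z)^2:
# phi(x) lists all products x_i * x_j, so phi(x) . phi(z) = (x . z)^2.
def phi(X):
    return np.einsum('ni,nj->nij', X, X).reshape(len(X), -1)

# (a) Linear ridge regression in the explicit 9-dimensional feature space.
F = phi(X)
w = np.linalg.solve(F.T @ F + lam * np.eye(F.shape[1]), F.T @ y)

# (b) Kernel ridge regression: only the Gram matrix of inner products is needed.
K = (X @ X.T) ** 2
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)

# Both routes give the same predictions on new points.
X_new = rng.normal(size=(5, 3))
pred_feature = phi(X_new) @ w
pred_kernel = ((X_new @ X.T) ** 2) @ alpha
```

The kernel route never constructs the feature space at all, which is the whole point: for kernels whose feature space is infinite-dimensional (e.g. the squared exponential), only the kernel route is available.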

Mini history: the oft-cited origins of all the reproducing-kernel stuff are (Mercer 1909; Aronszajn 1950). It took a while to percolate into random function theory (Khintchine 1934; Yaglom 1987) in the guise of covariance functions. Thence the idea arrived in statistical inference (Parzen 1959, 1962, 1963) and signal processing (Aasnaes and Kailath 1973; Duttweiler and Kailath 1973a, 1973b; Gevers and Kailath 1973; Kailath 1971a, 1971b, 1974; Kailath and Geesey 1971, 1973; Kailath, Geesey, and Weinert 1972; Kailath and Duttweiler 1972; Kailath and Weinert 1975), and now it is ubiquitous.

Practically, kernel methods have problems with scalability to large data sets. To apply any such method one must keep a full Gram matrix of inner products between every pair of data points: for \(N\) data points, that is \(N(N-1)/2\) distinct entries of a symmetric matrix. If that matrix must be inverted, the cost is \(\mathcal{O}(N^3)\), so large \(N\) demands fancy tricks. Which fancy tricks depends on the actual model, but they include sparse GPs, random-projection inversions, Markov approximations, and presumably many more.
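One such trick, sketched below under toy assumptions (squared-exponential kernel, sizes and lengthscale illustrative, landmarks chosen uniformly at random), is the standard Nyström landmark approximation, which replaces the \(N \times N\) Gram matrix with blocks built from \(m \ll N\) landmark points:

```python
import numpy as np

rng = np.random.default_rng(1)
N, m = 1000, 100              # N data points, m landmark points
X = rng.normal(size=(N, 2))

def k(A, B):                  # squared-exponential kernel
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2)

# The full Gram matrix costs O(N^2) memory and O(N^3) to invert...
K = k(X, X)                   # 1000 x 1000 here; infeasible for N in the millions

# ...whereas the Nyström approximation K_Nm @ K_mm^{-1} @ K_Nm.T needs only
# the N x m and m x m blocks built from the landmark points.
idx = rng.choice(N, m, replace=False)
K_Nm = k(X, X[idx])
K_mm = k(X[idx], X[idx])
K_nys = K_Nm @ np.linalg.solve(K_mm, K_Nm.T)

rel_err = np.linalg.norm(K - K_nys) / np.linalg.norm(K)
```

For smooth kernels the Gram spectrum decays rapidly, so the relative error stays small even with few landmarks; that spectral decay is what all the low-rank tricks exploit.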

I’m especially interested in the application of such tricks in

- kernel regression
- wide random NNs
- Nonparametric kernel independence tests
- ~~Efficient kernel pre-image approximation~~
- ~~Connection between kernel PCA and clustering (Schölkopf et al. 1998; Williams 2001)~~

*Turns out not all those applications are interesting to me.*

## 1 Introductions

There are many primers on Mercer kernels and their connection to ML. Kenneth Tay’s intro is punchy. Il Shan Ng’s *Reproducing Kernel Hilbert Spaces & Machine Learning* is good. See (Schölkopf and Smola 2002), which grinds out many connections with learning theory, or (Manton and Amblard 2015), which is more narrowly focussed on just the Mercer-kernel part and the topological and geometric properties of the spaces; see also (Ghojogh et al. 2021; Gori and Martínez-Herrero 2021; Gretton 2019). Cheney and Light (2009) take an approximation-theory perspective which does not especially concern itself with stochastic processes. I also seem to have bookmarked the following introductions: (Vert, Tsuda, and Schölkopf 2004; Schölkopf et al. 1999; Schölkopf, Herbrich, and Smola 2001; Muller et al. 2001; Schölkopf and Smola 2003).

Alex Smola (who, with Bernhard Schölkopf, has his name on an intimidating proportion of publications in this area) keeps all his publications online.

## 2 Kernel approximation

See kernel approximation.

## 3 RKHS distribution embedding

## 4 Specific kernels

See covariance functions.

## 5 Non-scalar-valued “kernels”

Extending the usual inner-product framing, *operator-valued kernels* (Micchelli and Pontil 2005a; Evgeniou, Micchelli, and Pontil 2005; Álvarez, Rosasco, and Lawrence 2012) generalise to \(k:\mathcal{X}\times \mathcal{X}\to \mathcal{L}(H_Y)\), as seen in multi-task learning.
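One common concrete instance is the separable (intrinsic coregionalisation) construction \(K(x,x')=k(x,x')B\), where \(B\) is a positive-semidefinite matrix coupling the \(T\) outputs. A toy sketch (my own illustrative sizes, kernel, and task-coupling matrix):

```python
import numpy as np

rng = np.random.default_rng(2)
N, T = 30, 2                   # N inputs, T output tasks
X = rng.normal(size=(N, 1))

def k(A, B_):                  # scalar squared-exponential kernel on 1-D inputs
    return np.exp(-((A - B_.T) ** 2))

# Separable operator-valued kernel: K(x, x') = k(x, x') * B,
# with B a positive-semidefinite T x T matrix coupling the tasks.
B = np.array([[1.0, 0.9],
              [0.9, 1.0]])     # strongly correlated tasks (illustrative choice)

# The full multi-task Gram matrix is a Kronecker product (NT x NT),
# with row i*T + t corresponding to (point i, task t).
K_multi = np.kron(k(X, X), B)

# Multi-task kernel ridge regression: stack the task targets and solve once.
Y = np.column_stack([np.sin(X[:, 0]),
                     np.sin(X[:, 0]) + 0.1 * rng.normal(size=N)])
alpha = np.linalg.solve(K_multi + 1e-3 * np.eye(N * T), Y.reshape(-1))
```

The Kronecker structure is what makes this tractable in practice: inverting \(K\otimes B\) factorises into inverting the two small matrices separately, although the naive solve above ignores that saving.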

## 6 Tools

### 6.1 KeOps

File under least squares, autodiff, gps, pytorch.

> The KeOps library lets you compute reductions of large arrays whose entries are given by a mathematical formula or a neural network. It combines efficient C++ routines with an automatic differentiation engine and can be used with Python (NumPy, PyTorch), Matlab and R.
>
> It is perfectly suited to the computation of kernel matrix-vector products, K-nearest-neighbors queries, N-body interactions, point-cloud convolutions and the associated gradients. Crucially, it performs well even when the corresponding kernel or distance matrices do not fit into RAM or GPU memory. Compared with a PyTorch GPU baseline, KeOps provides a 10-100× speed-up on a wide range of geometric applications, from kernel methods to geometric deep learning.
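I do not reproduce the KeOps API here, but the core memory trick it automates, computing kernel reductions block-by-block instead of materialising the full \(N\times M\) matrix, can be sketched in plain NumPy (function name and Gaussian kernel are my illustrative choices):

```python
import numpy as np

def gaussian_kernel_matvec(x, y, b, block=512):
    """Compute K @ b with K_ij = exp(-|x_i - y_j|^2) without ever
    holding the full N x M kernel matrix in memory."""
    out = np.zeros((x.shape[0],) + b.shape[1:])
    for start in range(0, x.shape[0], block):
        xb = x[start:start + block]                  # one block of rows at a time
        d2 = ((xb[:, None, :] - y[None, :, :]) ** 2).sum(-1)
        out[start:start + block] = np.exp(-d2) @ b   # only a block x M slab lives in RAM
    return out

rng = np.random.default_rng(3)
x, y = rng.normal(size=(2000, 3)), rng.normal(size=(1500, 3))
b = rng.normal(size=(1500, 1))
res = gaussian_kernel_matvec(x, y, b)
```

KeOps does the same thing symbolically, fusing the formula into a single GPU kernel rather than looping in Python, which is where the speed-up comes from.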

### 6.2 Falkon

> A Python library for large-scale kernel methods, with optional (multi-)GPU acceleration.
>
> The library currently includes two solvers: one for approximate kernel ridge regression (Rudi, Carratino, and Rosasco 2017), which is extremely fast, and one for kernel logistic regression (Marteau-Ferey, Bach, and Rudi 2019), which trades off lower speed for better accuracy on binary classification problems.
>
> The main features of Falkon are:
>
> - **Full multi-GPU support**: all compute-intensive parts of the algorithms are multi-GPU capable.
> - **Extreme scalability**: unlike other kernel solvers, we keep memory usage in check; we have tested the library with datasets of billions of points.
> - **Sparse data support.**
> - **Scikit-learn integration**: our estimators follow the scikit-learn API.
