# Representer theorems

September 16, 2019

approximation · functional analysis · Hilbert space · kernel tricks · metrics · statistics

Notes on representer theorems, as they arise in spatial statistics, Gaussian processes, kernel machines and covariance functions, and regularisation. The classic result: under broad conditions, the minimiser of a regularised empirical risk over a reproducing-kernel Hilbert space is a finite linear combination of the kernel evaluated at the training points (Kimeldorf and Wahba 1970; Schölkopf, Herbrich, and Smola 2001), so an infinite-dimensional optimisation collapses to a finite one.
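As a concrete illustration of that collapse (my own minimal sketch, not taken from any of the cited papers; the RBF kernel and the regularisation weight are arbitrary choices): in kernel ridge regression the representer theorem guarantees the minimiser has the form \(f(x) = \sum_i \alpha_i k(x_i, x)\), so fitting reduces to solving one finite linear system for \(\alpha\).

```python
import numpy as np

def rbf_kernel(X, Z, length_scale=1.0):
    # Gram matrix with entries k(x, z) = exp(-||x - z||^2 / (2 * length_scale^2)).
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * length_scale ** 2))

def fit_kernel_ridge(X, y, lam=1e-3):
    # By the representer theorem, the minimiser over the RKHS of
    #   (1/n) * sum_i (f(x_i) - y_i)^2 + lam * ||f||_H^2
    # is f(x) = sum_i alpha_i k(x_i, x), where alpha solves
    #   (K + n * lam * I) alpha = y.
    n = len(X)
    K = rbf_kernel(X, X)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def predict(X_train, alpha, X_new):
    # Evaluate the finite kernel expansion at new points.
    return rbf_kernel(X_new, X_train) @ alpha

# Tiny demo: fit a noisy sine on 40 points.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)
alpha = fit_kernel_ridge(X, y)
X_test = np.linspace(-3, 3, 9)[:, None]
pred = predict(X, alpha, X_test)
```

Note that nothing here ever represents \(f\) itself; only the \(n\)-vector \(\alpha\) is stored, which is the practical content of the theorem.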

🏗

## References

Bohn, Griebel, and Rieger. 2018. “A Representer Theorem for Deep Kernel Learning.” *arXiv:1709.10441 [Cs, Math]*.

Boyer, Chambolle, De Castro, et al. 2018. “On Representer Theorems and Convex Regularization.” *arXiv:1806.09810 [Cs, Math]*.

Boyer, Chambolle, de Castro, et al. 2018. “Convex Regularization and Representer Theorems.” *arXiv:1812.04355 [Cs, Math]*.

Chernozhukov, Newey, and Singh. 2018. “Learning L2 Continuous Regression Functionals via Regularized Riesz Representers.” *arXiv:1809.05224 [Econ, Math, Stat]*.

Kar, and Karnick. 2012. “Random Feature Maps for Dot Product Kernels.” In *Artificial Intelligence and Statistics*.

Kimeldorf, and Wahba. 1970. “A Correspondence Between Bayesian Estimation on Stochastic Processes and Smoothing by Splines.” *The Annals of Mathematical Statistics*.

Schlegel. 2018. “When Is There a Representer Theorem? Reflexive Banach Spaces.” *arXiv:1809.10284 [Cs, Math, Stat]*.

Schölkopf, Herbrich, and Smola. 2001. “A Generalized Representer Theorem.” In *Computational Learning Theory*. Lecture Notes in Computer Science.

Unser. 2019. “A Representer Theorem for Deep Neural Networks.” *Journal of Machine Learning Research*.

Walder, Schölkopf, and Chapelle. 2006. “Implicit Surface Modelling with a Globally Regularised Basis of Compact Support.” *Computer Graphics Forum*.

Yu, Cheng, Schuurmans, et al. 2013. “Characterizing the Representer Theorem.” In *Proceedings of the 30th International Conference on Machine Learning (ICML-13)*.