Deep Gaussian process regression



Figure: Gaussian process layer cake.

Platonic ideal

A deep Gaussian process stacks GP layers so that the output of one GP becomes the input to the next (Damianou and Lawrence 2013). The composition is no longer itself a GP: its marginals are non-Gaussian, which buys extra flexibility at the price of intractable inference, typically handled with variational approximations (Salimbeni and Deisenroth 2017) or random-feature expansions (Cutajar et al. 2017).
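A minimal sketch of the prior such a stack defines, assuming squared-exponential kernels and the usual jitter-stabilised Cholesky sampler (all names here are illustrative, not from any particular library):

```python
import numpy as np

def rbf_kernel(x, y, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel: k(x, y) = variance * exp(-(x - y)^2 / (2 * lengthscale^2))
    sq = (x[:, None] - y[None, :]) ** 2
    return variance * np.exp(-0.5 * sq / lengthscale**2)

def sample_gp_layer(x, rng, jitter=1e-6):
    # Draw one function sample f ~ GP(0, k) evaluated at the inputs x.
    # Jitter on the diagonal keeps the Cholesky factorisation stable.
    K = rbf_kernel(x, x) + jitter * np.eye(len(x))
    L = np.linalg.cholesky(K)
    return L @ rng.standard_normal(len(x))

rng = np.random.default_rng(0)
x = np.linspace(-3.0, 3.0, 50)
h = sample_gp_layer(x, rng)   # first layer: f1(x)
y = sample_gp_layer(h, rng)   # second layer: f2(f1(x)), one draw from a 2-layer deep GP prior
```

Feeding one GP draw through another is all a deep GP prior is; the resulting `y` is not Gaussian in `x`, which is the whole point.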

Approximation with dropout

Gal and Ghahramani (2015) argue that a neural network trained with dropout can be read as approximate variational inference in a deep GP, so keeping dropout active at test time and averaging many stochastic forward passes gives approximate posterior predictive moments. See NN ensembles.
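A minimal sketch of that test-time procedure ("MC dropout"), assuming an already-trained network; the tiny untrained MLP and its sizes here are stand-ins for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in weights: in practice these come from training with dropout.
W1 = rng.standard_normal((1, 64)) * 0.1
b1 = np.zeros(64)
W2 = rng.standard_normal((64, 1)) * 0.1
b2 = np.zeros(1)

def forward(x, rng, p=0.5):
    # One stochastic forward pass: dropout stays ON at test time.
    h = np.maximum(x @ W1 + b1, 0.0)   # ReLU hidden layer
    mask = rng.random(h.shape) > p     # Bernoulli dropout mask
    h = h * mask / (1.0 - p)           # inverted-dropout scaling
    return h @ W2 + b2

x = np.linspace(-2.0, 2.0, 20)[:, None]
# Average many stochastic passes: mean and std approximate the
# predictive moments of the corresponding deep GP posterior.
samples = np.stack([forward(x, rng) for _ in range(200)])
mean, std = samples.mean(axis=0), samples.std(axis=0)
```

The spread across passes, not any single pass, carries the (approximate) model uncertainty.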

References

Cutajar, Kurt, Edwin V. Bonilla, Pietro Michiardi, and Maurizio Filippone. 2017. “Random Feature Expansions for Deep Gaussian Processes.” In PMLR. http://proceedings.mlr.press/v70/cutajar17a.html.
Damianou, Andreas, and Neil Lawrence. 2013. “Deep Gaussian Processes.” In Artificial Intelligence and Statistics, 207–15. PMLR. http://proceedings.mlr.press/v31/damianou13a.html.
Domingos, Pedro. 2020. “Every Model Learned by Gradient Descent Is Approximately a Kernel Machine.” November 30, 2020. http://arxiv.org/abs/2012.00152.
Dunlop, Matthew M., Mark A. Girolami, Andrew M. Stuart, and Aretha L. Teckentrup. 2018. “How Deep Are Deep Gaussian Processes?” Journal of Machine Learning Research 19 (1): 2100–2145. http://jmlr.org/papers/v19/18-015.html.
Dutordoir, Vincent, James Hensman, Mark van der Wilk, Carl Henrik Ek, Zoubin Ghahramani, and Nicolas Durrande. 2021. “Deep Neural Networks as Point Estimates for Deep Gaussian Processes.” May 10, 2021. http://arxiv.org/abs/2105.04504.
Gal, Yarin, and Zoubin Ghahramani. 2015. “Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning.” In Proceedings of the 33rd International Conference on Machine Learning (ICML-16). http://arxiv.org/abs/1506.02142.
———. 2016. “A Theoretically Grounded Application of Dropout in Recurrent Neural Networks.” In Advances in Neural Information Processing Systems. http://arxiv.org/abs/1512.05287.
Jankowiak, Martin, Geoff Pleiss, and Jacob Gardner. 2020. “Deep Sigma Point Processes.” In Conference on Uncertainty in Artificial Intelligence, 789–98. PMLR. http://proceedings.mlr.press/v124/jankowiak20a.html.
Kingma, Diederik P., Tim Salimans, and Max Welling. 2015. “Variational Dropout and the Local Reparameterization Trick.” In Proceedings of the 28th International Conference on Neural Information Processing Systems - Volume 2, 2575–83. NIPS’15. Cambridge, MA, USA: MIT Press. http://arxiv.org/abs/1506.02557.
Lee, Jaehoon, Yasaman Bahri, Roman Novak, Samuel S. Schoenholz, Jeffrey Pennington, and Jascha Sohl-Dickstein. 2018. “Deep Neural Networks as Gaussian Processes.” In ICLR. http://arxiv.org/abs/1711.00165.
Leibfried, Felix, Vincent Dutordoir, S. T. John, and Nicolas Durrande. 2021. “A Tutorial on Sparse Gaussian Processes and Variational Inference.” June 11, 2021. http://arxiv.org/abs/2012.13962.
Mattos, César Lincoln C., Zhenwen Dai, Andreas Damianou, Guilherme A. Barreto, and Neil D. Lawrence. 2017. “Deep Recurrent Gaussian Processes for Outlier-Robust System Identification.” Journal of Process Control, DYCOPS-CAB 2016, 60 (December): 82–94. https://doi.org/10.1016/j.jprocont.2017.06.010.
Molchanov, Dmitry, Arsenii Ashukha, and Dmitry Vetrov. 2017. “Variational Dropout Sparsifies Deep Neural Networks.” In Proceedings of ICML. http://arxiv.org/abs/1701.05369.
Salimbeni, Hugh, and Marc Deisenroth. 2017. “Doubly Stochastic Variational Inference for Deep Gaussian Processes.” In Advances In Neural Information Processing Systems. http://arxiv.org/abs/1705.08933.
