Bayes NNs where only some weights are random and others are fixed. This raises various difficulties: how do you update a fixed parameter? Is this even principled? Sharma et al. (2022) argue that it can be, and that partially stochastic networks can match fully stochastic ones.
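For concreteness, here is a minimal sketch of the flavour of model at issue, loosely in the spirit of subnetwork inference (Daxberger et al. 2021): MAP-train everything, then fit an approximate Gaussian posterior over a small subset of weights while the rest stay frozen. The toy data, the tiny network, and the choice of subnetwork below are my own illustrative assumptions, not their procedure.

```python
# Sketch: MAP-train all weights, then a diagonal Laplace posterior over a
# chosen subnetwork, leaving every other weight fixed at its MAP value.
import torch

torch.manual_seed(0)
X = torch.randn(128, 2)
y = (X @ torch.tensor([1.5, -0.7]) + 0.3 * torch.randn(128)).unsqueeze(-1)

net = torch.nn.Sequential(
    torch.nn.Linear(2, 16), torch.nn.Tanh(), torch.nn.Linear(16, 1)
)

# 1. MAP estimate for *all* weights (weight decay plays the Gaussian prior).
opt = torch.optim.Adam(net.parameters(), lr=1e-2, weight_decay=1e-3)
for _ in range(500):
    opt.zero_grad()
    torch.nn.functional.mse_loss(net(X), y).backward()
    opt.step()

# 2. Choose a subnetwork; here, arbitrarily, the first layer's weight matrix.
#    (Daxberger et al. select weights more cleverly, by estimated variance.)
w_map = net[0].weight.detach().clone().flatten()

def nll(w_flat):
    # Negative log-likelihood as a function of the subnetwork weights only;
    # every other parameter stays frozen at its MAP value.
    h = torch.tanh(X @ w_flat.view_as(net[0].weight).T + net[0].bias)
    return 0.5 * ((h @ net[2].weight.T + net[2].bias - y) ** 2).sum()

# 3. Diagonal Laplace: curvature at the MAP plus prior precision gives the
#    approximate posterior precision (clamped for numerical safety).
H_diag = torch.autograd.functional.hessian(nll, w_map).diagonal()
post_var = 1.0 / (torch.clamp(H_diag, min=0.0) + 1e-3)
posterior = torch.distributions.Normal(w_map, post_var.sqrt())

# Predict by resampling only the subnetwork; everything else never moves.
w_samples = posterior.sample((100,))
```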
How to update a deterministic parameter?
From the perspective of Bayes, parameters we treat as deterministic have zero posterior variance. And yet we do update them by SGD. What does that mean? How can we make it statistically well-posed?
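One way to make it well-posed, sketched below under my own assumptions rather than taken from any of the cited papers: give the deterministic parameters a Dirac-delta variational posterior. SGD on the delta's location is then coordinate ascent on the same ELBO that the stochastic parameters are trained on, so "updating a zero-variance parameter" is just an ordinary variational update of a degenerate family.

```python
# Sketch: one ELBO, two variational families. The feature map gets delta
# posteriors (plain parameters); the head gets a mean-field Gaussian.
import torch

torch.manual_seed(0)
X = torch.randn(128, 2)
y = (X @ torch.tensor([1.5, -0.7]) + 0.3 * torch.randn(128)).unsqueeze(-1)

# Deterministic feature map: plain parameters, i.e. delta variational posteriors.
feature = torch.nn.Sequential(torch.nn.Linear(2, 16), torch.nn.Tanh())

# Stochastic head: mean-field Gaussian over the last-layer weights.
w_mu = torch.zeros(16, 1, requires_grad=True)
w_log_sigma = torch.full((16, 1), -2.0, requires_grad=True)

opt = torch.optim.Adam(list(feature.parameters()) + [w_mu, w_log_sigma], lr=1e-2)
for _ in range(1000):
    opt.zero_grad()
    # Reparameterised sample of the stochastic block.
    w = w_mu + torch.exp(w_log_sigma) * torch.randn_like(w_mu)
    nll = 0.5 * ((feature(X) @ w - y) ** 2).sum()
    # KL(q||p) for the Gaussian block against a standard normal prior.
    kl = 0.5 * (torch.exp(2 * w_log_sigma) + w_mu**2 - 1 - 2 * w_log_sigma).sum()
    # The delta block's entropy is constant in its location, so only
    # -log p(theta) survives, i.e. weight decay; omitted here for brevity.
    (nll + kl).backward()
    opt.step()
```

Nothing special happens at the boundary between the two blocks: the deterministic parameters simply receive no KL gradient, which matches the intuition that SGD on them is MAP estimation conditioned on the stochastic ones.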
Last layer
The most famous special case: treat only the final layer's weights as random. See Bayes last layer.
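For reference, a sketch of the textbook construction (my toy stand-in, not any particular library's API): freeze the learned features and do exact conjugate Bayesian linear regression on the last layer, which gives closed-form predictive variances.

```python
# Sketch: conjugate Bayesian linear regression on frozen features.
import torch

torch.manual_seed(0)
W = torch.randn(2, 16)  # stand-in for a trained, now-frozen feature extractor
Phi = torch.tanh(torch.randn(128, 2) @ W)  # training features
y = Phi @ torch.randn(16, 1) + 0.1 * torch.randn(128, 1)

alpha, beta = 1.0, 100.0  # prior precision on weights; observation precision
# Conjugate posterior over last-layer weights w ~ N(mu, Sigma):
#   Sigma = (alpha I + beta Phi^T Phi)^{-1},   mu = beta Sigma Phi^T y.
Sigma = torch.linalg.inv(alpha * torch.eye(16) + beta * Phi.T @ Phi)
mu = beta * Sigma @ Phi.T @ y

# Closed-form predictive mean and variance at new feature vectors.
phi_star = torch.tanh(torch.randn(5, 2) @ W)
pred_mean = phi_star @ mu
pred_var = 1.0 / beta + (phi_star @ Sigma * phi_star).sum(-1, keepdim=True)
```

Because the head is linear-Gaussian given the features, this step is exact; all the approximation error lives in treating the features as fixed.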
References
Daxberger, Erik, Eric Nalisnick, James U. Allingham, Javier Antorán, and José Miguel Hernández-Lobato. 2021. "Bayesian Deep Learning via Subnetwork Inference." In Proceedings of the 38th International Conference on Machine Learning, 2510–21. PMLR.
Daxberger, Erik, Eric Nalisnick, James Urquhart Allingham, Javier Antorán, and José Miguel Hernández-Lobato. 2020. "Expressive yet Tractable Bayesian Deep Learning via Subnetwork Inference." In.
Izmailov, Pavel, Wesley J. Maddox, Polina Kirichenko, Timur Garipov, Dmitry Vetrov, and Andrew Gordon Wilson. 2020. "Subspace Inference for Bayesian Deep Learning." In Proceedings of The 35th Uncertainty in Artificial Intelligence Conference, 1169–79. PMLR.
Ke, Xiongwen, and Yanan Fan. 2022. "On the Optimization and Pruning for Bayesian Deep Learning." arXiv.
Kowal, Daniel R. 2022. "Bayesian Subset Selection and Variable Importance for Interpretable Prediction and Classification." arXiv.
Sharma, Mrinank, Sebastian Farquhar, Eric Nalisnick, and Tom Rainforth. 2022. "Do Bayesian Neural Networks Need To Be Fully Stochastic?" arXiv.
Tran, M.-N., N. Nguyen, D. Nott, and R. Kohn. 2019. "Bayesian Deep Net GLM and GLMM." Journal of Computational and Graphical Statistics 29 (ja): 1–40.