Self-supervised learning
I just wanna be meeeeee / with high probabilityyy ♬♪
2022-03-04 — 2022-03-04
Wherein a notebook on self-supervised learning is presented, with emphasis placed on contrastive learning methods, and illustrative notes on learning representations from transformed versions of a signal are supplied.
hidden variables
likelihood free
nonparametric
statistics
unsupervised
Notebook on an area about which I know little. Probably mostly notes on contrastive learning for now?
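As a placeholder gist: the common thread in the references below is to turn estimation into classification, training a model to tell genuine data, or genuinely paired views of a datum, apart from noise or mismatched pairs (Gutmann and Hyvärinen 2010). Here is a minimal NumPy sketch of an InfoNCE-style batch contrastive loss, in which row i of `z1` is positively paired with row i of `z2` and every other row in the batch serves as a negative. The function name, the temperature `tau`, and the toy data are my own illustrative choices, not taken from any of the cited papers.

```python
import numpy as np

def info_nce(z1, z2, tau=0.1):
    """InfoNCE-style contrastive loss (one pairing direction).

    z1, z2: (n, d) arrays of embeddings of two augmented views;
    row i of z1 and row i of z2 form a positive pair, all other
    rows in the batch are treated as negatives.
    """
    # L2-normalise so the dot product is a cosine similarity
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / tau  # (n, n) similarity matrix
    # row-wise log-softmax; the diagonal entries are the positives
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))

rng = np.random.default_rng(0)
z1 = rng.normal(size=(8, 16))
z2 = z1 + 0.01 * rng.normal(size=(8, 16))  # a stand-in for an augmented view
print(info_nce(z1, z2))
```

Symmetric variants (e.g. NT-Xent in SimCLR) average this loss over both pairing directions; a single direction keeps the sketch short.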
1 References
Balestriero, Ibrahim, Sobal, et al. 2023. “A Cookbook of Self-Supervised Learning.”
Chehab, Gramfort, and Hyvärinen. 2022. “The Optimal Noise in Noise-Contrastive Learning Is Not What You Think.” arXiv:2203.01110 [Cs, Stat].
Gutmann, Michael, and Hyvärinen. 2010. “Noise-Contrastive Estimation: A New Estimation Principle for Unnormalized Statistical Models.” In Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics.
Gutmann, Michael U., and Hyvärinen. 2012. “Noise-Contrastive Estimation of Unnormalized Statistical Models, with Applications to Natural Image Statistics.” Journal of Machine Learning Research.
Le-Khac, Healy, and Smeaton. 2020. “Contrastive Representation Learning: A Framework and Review.” IEEE Access.
Ma, and Collins. 2018. “Noise Contrastive Estimation and Negative Sampling for Conditional Models: Consistency and Statistical Efficiency.” arXiv:1809.01812 [Cs, Stat].
Saunshi, Ash, Goel, et al. 2022. “Understanding Contrastive Learning Requires Incorporating Inductive Biases.” arXiv:2202.14037 [Cs].
Shwartz-Ziv, and LeCun. 2023. “To Compress or Not to Compress - Self-Supervised Learning and Information Theory: A Review.”
Smith, and Eisner. 2005. “Contrastive Estimation: Training Log-Linear Models on Unlabeled Data.” In Proceedings of the 43rd Annual Meeting of the Association for Computational Linguistics (ACL ’05).