Semi/weakly-supervised learning

On extracting nutrition from bullshit

July 24, 2017 — July 5, 2021

density
hidden variables
statistics
unsupervised

I’m not yet sure what this is, but I’ve seen these words invoked in machine learning problems with a partially observed model, where one hopes to simultaneously learn the parameters of the label-generation process and of the observation process. For example: suppose I have a bunch of crowd-sourced labels for my data and wish to use them to train a classifier, but I suspect that my crowd is a little unreliable. Then “weakly supervised” learning means learning both the true labels and the crowd-whimsy process, as a kind of hierarchical model of informative sampling. Alternatively, I might assume no explicit model for the crowd whimsy, but simply require that similar data should not be labelled too differently, a.k.a. Label Propagation, which uses graph clustering to infer labels for the unlabelled data. A sketch of each follows.
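To make the first idea concrete, here is a minimal sketch (mine, not from any particular library) of jointly estimating true labels and annotator reliability in the style of Dawid and Skene: EM alternates between a posterior over each item’s true label and a per-annotator confusion matrix. The function name is made up, and it assumes every item receives at least one vote.

```python
import numpy as np

def dawid_skene(votes, n_classes, n_iter=50):
    """EM for crowd labels. votes: (n_items, n_annotators) ints, -1 = missing.
    Assumes every item has at least one vote."""
    n_items, n_annotators = votes.shape

    # Initialise label posteriors from a simple majority vote.
    post = np.zeros((n_items, n_classes))
    for i in range(n_items):
        for v in votes[i]:
            if v >= 0:
                post[i, v] += 1.0
    post /= post.sum(axis=1, keepdims=True)

    for _ in range(n_iter):
        # M-step: class prior and per-annotator confusion matrices,
        # conf[a, t, v] = P(annotator a says v | true label is t).
        prior = post.mean(axis=0) + 1e-9
        conf = np.full((n_annotators, n_classes, n_classes), 1e-6)
        for a in range(n_annotators):
            for i in range(n_items):
                v = votes[i, a]
                if v >= 0:
                    conf[a, :, v] += post[i]
        conf /= conf.sum(axis=2, keepdims=True)

        # E-step: posterior over each item's true label given all votes.
        log_post = np.tile(np.log(prior), (n_items, 1))
        for a in range(n_annotators):
            for i in range(n_items):
                v = votes[i, a]
                if v >= 0:
                    log_post[i] += np.log(conf[a, :, v])
        post = np.exp(log_post - log_post.max(axis=1, keepdims=True))
        post /= post.sum(axis=1, keepdims=True)

    return post, conf  # soft labels, and each annotator's estimated whimsy
```

Label propagation itself is nearly a one-liner in scikit-learn, which marks unlabelled points with -1; the toy two-blob data here is invented for illustration:

```python
import numpy as np
from sklearn.semi_supervised import LabelPropagation

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (100, 2)), rng.normal(2, 1, (100, 2))])
y = -np.ones(200, dtype=int)  # -1 marks unlabelled points
y[0], y[100] = 0, 1           # one labelled seed per cluster

model = LabelPropagation(kernel="rbf", gamma=0.5).fit(X, y)
inferred = model.transduction_  # inferred labels for all 200 points
```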

Other methods?


1 Self-supervision

See self-supervised learning.

2 Data augmentation

Here are some tools.

snorkel:

Snorkel is a system for rapidly creating, modelling, and managing training data, currently focused on accelerating the development of structured or “dark” data extraction applications for domains in which large labelled training sets are not available or easy to obtain.

Today’s state-of-the-art machine learning models require massive labelled training sets — which usually do not exist for real-world applications. Instead, Snorkel is based around the new data programming paradigm, in which the developer focuses on writing a set of labelling functions, which are just scripts that programmatically label data. The resulting labels are noisy, but Snorkel automatically models this process—learning, essentially, which labelling functions are more accurate than others—and then uses this to train an end model (for example, a deep neural network in TensorFlow).

Surprisingly, by modelling a noisy training set creation process in this way, we can take potentially low-quality labelling functions from the user, and use these to train high-quality end models. We see Snorkel as providing a general framework for many weak supervision techniques, and as defining a new programming model for weakly-supervised machine learning systems.
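To make the data-programming idea concrete, here is a toy, framework-free sketch; it is not Snorkel’s actual API, and the labelling functions, keywords, and dev set are invented for illustration. Labelling functions vote or abstain, and votes are combined by an accuracy-weighted vote; Snorkel proper learns a generative model of the labelling process rather than using fixed per-function weights.

```python
import numpy as np

ABSTAIN = -1

# Hypothetical labelling functions for a toy spam task.
def lf_contains_winner(text):
    return 1 if "winner" in text.lower() else ABSTAIN

def lf_contains_meeting(text):
    return 0 if "meeting" in text.lower() else ABSTAIN

def lf_many_exclamations(text):
    return 1 if text.count("!") > 2 else ABSTAIN

LFS = [lf_contains_winner, lf_contains_meeting, lf_many_exclamations]

def apply_lfs(texts):
    """Label matrix L: rows are examples, columns are labelling functions."""
    return np.array([[lf(t) for lf in LFS] for t in texts])

def lf_accuracies(L_dev, y_dev):
    """Estimate each LF's accuracy where it does not abstain."""
    accs = []
    for j in range(L_dev.shape[1]):
        fired = L_dev[:, j] != ABSTAIN
        accs.append((L_dev[fired, j] == y_dev[fired]).mean() if fired.any() else 0.5)
    return np.array(accs)

def weighted_vote(L, accs):
    """Soft labels: log-odds-weighted vote of the non-abstaining LFs."""
    accs = np.clip(accs, 1e-3, 1 - 1e-3)       # guard against 0/1 accuracies
    w = np.log(accs / (1 - accs))              # accuracy -> log-odds weight
    score = ((L == 1) * w - (L == 0) * w).sum(axis=1)
    return 1 / (1 + np.exp(-score))            # P(label = 1) per example

texts_dev = ["winner winner!!!", "Meeting at noon.", "You won, winner"]
y_dev = np.array([1, 0, 1])                    # tiny hand-labelled dev set
accs = lf_accuracies(apply_lfs(texts_dev), y_dev)

texts = ["You are a winner!!!", "Meeting moved to 3pm", "hello!!!"]
probs = weighted_vote(apply_lfs(texts), accs)  # noisy soft labels
```

The resulting soft labels can then serve as training targets for a discriminative end model, as in Ratner et al. (2016).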

3 References

Bach, He, Ratner, et al. 2017. “Learning the Structure of Generative Models Without Labeled Data.” In Proceedings of the 34th International Conference on Machine Learning.
Bardes, Ponce, and LeCun. 2022. “VICReg: Variance-Invariance-Covariance Regularization for Self-Supervised Learning.”
Chapelle, Schölkopf, and Zien, eds. 2010. Semi-Supervised Learning. Adaptive Computation and Machine Learning.
Delalleau, Bengio, and Le Roux. 2005. “Efficient Nonparametric Function Induction in Semi-Supervised Learning.” In Proc. Artificial Intelligence and Statistics.
Fonseca, Plakal, Ellis, et al. 2019. “Learning Sound Event Classifiers from Web Audio with Noisy Labels.” arXiv:1901.01189 [Cs, Eess, Stat].
Jung, Hero III, Mara, et al. 2016. “Semi-Supervised Learning via Sparse Label Propagation.” arXiv:1612.01414 [Cs, Stat].
Karpathy, Toderici, Shetty, et al. 2014. “Large-Scale Video Classification with Convolutional Neural Networks.” In Proceedings of the 2014 IEEE Conference on Computer Vision and Pattern Recognition. CVPR ’14.
Kügelgen, Mey, Loog, et al. 2020. “Semi-Supervised Learning, Causality, and the Conditional Cluster Assumption.” In Conference on Uncertainty in Artificial Intelligence.
Kumar, and Raj. 2016. “Audio Event Detection Using Weakly Labeled Data.” In Proceedings of the 2016 ACM on Multimedia Conference. MM ’16.
———. 2017. “Deep CNN Framework for Audio Event Recognition Using Weakly Labeled Web Data.” arXiv:1707.02530 [Cs].
Li, and Tang. 2015. “Weakly Supervised Deep Metric Learning for Community-Contributed Image Retrieval.” IEEE Transactions on Multimedia.
Misra, Zitnick, Mitchell, et al. 2015. “Seeing Through the Human Reporting Bias: Visual Classifiers from Noisy Human-Centric Labels.” In Proceedings of CVPR.
Ouali, Hudelot, and Tami. 2020. “An Overview of Deep Semi-Supervised Learning.”
Papandreou, Chen, Murphy, et al. n.d. “Weakly- and Semi-Supervised Learning of a Deep Convolutional Network for Semantic Image Segmentation.”
Ratner, Alexander, Bach, Ehrenberg, et al. 2017. “Snorkel: Rapid Training Data Creation with Weak Supervision.” Proceedings of the VLDB Endowment.
Ratner, Alexander J, De Sa, Wu, et al. 2016. “Data Programming: Creating Large Training Sets, Quickly.” In Advances in Neural Information Processing Systems 29.
Varma, He, Bajaj, et al. 2017. “Inferring Generative Model Structure with Static Analysis.” In Advances in Neural Information Processing Systems.
Wu, Wang, Zhang, et al. 2015. “Weakly Semi-Supervised Deep Learning for Multi-Label Image Annotation.” IEEE Transactions on Big Data.
Zhou, Bousquet, Lal, et al. 2003. “Learning with Local and Global Consistency.” In Proceedings of the 16th International Conference on Neural Information Processing Systems. NIPS’03.
Zhu, and Ghahramani. 2002. “Learning from Labeled and Unlabeled Data with Label Propagation.”