HOWTOs
The internet is full of guides to training neural nets. Here are some selected highlights.
Michael Nielsen has a free online textbook with code examples in Python. Christopher Olah’s visual explanations make many things clear.
Andrej Karpathy’s popular, unromantic, messy guide to training neural nets in practice has a lot of tips that people tend to rediscover the hard way if they do not get them from him. (I did.)
It is allegedly easy to get started with training neural nets. Numerous libraries and frameworks take pride in displaying 30-line miracle snippets that solve your data problems, giving the (false) impression that this stuff is plug and play. … Unfortunately, neural nets are nothing like that. They are not “off-the-shelf” technology the second you deviate slightly from training an ImageNet classifier.
Profiling and performance optimisation
- google-research/tuning_playbook: A playbook for systematically maximizing the performance of deep learning models.
- Making Deep Learning go Brrrr From First Principles
- Monitor & Improve GPU Usage for Model Training on Weights & Biases
- Tracking system resource (GPU, CPU, etc.) utilization during training with the Weights & Biases Dashboard
- Algorithms for Modern Hardware - Algorithmica
- PyTorch profilers
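Before reaching for GPU-specific tooling, even Python’s built-in cProfile can show where training-loop time goes. A minimal sketch, in which `train_step` is a hypothetical CPU-bound stand-in for a real training step:

```python
import cProfile
import io
import pstats


def train_step(batch):
    # Hypothetical stand-in for a real training step: some CPU-bound work.
    return sum(x * x for x in batch)


def train(n_steps=100):
    batch = list(range(1000))
    for _ in range(n_steps):
        train_step(batch)


# Profile the loop and report the top hotspots by cumulative time.
profiler = cProfile.Profile()
profiler.enable()
train()
profiler.disable()

stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
print(stream.getvalue())
```

The same recipe transfers to the framework profilers listed above: wrap the training loop, sort by the cost that matters, and read off the hotspots.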
NN Software
I have used
- PyTorch
- Julia
- JAX
- Occasionally, reluctantly, TensorFlow
I could use any of the other autodiff systems, such as…
- Theano (Python) (now defunct) was a trailblazer
- Torch (lua) —in practice deprecated in favour of pytorch
- Caffe (MATLAB/Python) was popular for a while; I have not seen it recently
- PaddlePaddle is one of Baidu’s NN properties (Python/C++)
- MindSpore is Huawei’s framework, based on source-transformation autodiff; it targets interesting edge hardware.
- javascript: see javascript machine learning
- Julia: various autodiff and full-service ML tools.
Compiled
See edge ml for a discussion of compiled NNs.
Tracking experiments
Configuring experiments
See configuring experiments; in practice I use hydra for everything.
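For concreteness, a hedged sketch of the Hydra pattern: a YAML config like the hypothetical one below lives at `conf/config.yaml`, an entry point decorated with `@hydra.main(config_path="conf", config_name="config")` receives it as an `omegaconf.DictConfig`, and any key can be overridden on the command line (e.g. `python train.py optimizer.lr=3e-4`). The file layout and key names here are made up for illustration.

```yaml
# conf/config.yaml — a hypothetical Hydra config
optimizer:
  name: adam
  lr: 1e-3
trainer:
  max_epochs: 10
  seed: 42
```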
Pre-computed/trained models
Caffe format:
The Caffe Model Zoo has lots of nice pre-trained models on their wiki
Here’s a great CV one, Andrej Karpathy’s image captioner Neuraltalk2, for the NVC dataset (pre-trained feature model available)
For lasagne: https://github.com/Lasagne/Recipes/tree/master/modelzoo
For Keras:
Managing axes
A lot of the time, managing deep learning code is remembering which axis is which. Practically, I have found Einstein summation convention to solve all my needs.
However, there are alternatives. Alexander Rush argues for NamedTensor. Implementations:
- Native PyTorch
- namedtensor (PyTorch)
- labeledtensor (TensorFlow)
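To make the einsum claim concrete, a minimal sketch using NumPy (the shapes and axis letters are made up for illustration):

```python
import numpy as np

# A batch of 4 samples, each a 3×5 matrix, times a shared 5×2 weight matrix.
batch = np.random.randn(4, 3, 5)
weights = np.random.randn(5, 2)

# Einstein convention names every axis explicitly:
# b=batch, i=rows, k=contracted axis, j=output columns.
# Unlike .dot/.matmul, there is no guessing which axis gets contracted.
out = np.einsum("bik,kj->bij", batch, weights)
print(out.shape)  # (4, 3, 2)
```

The same `einsum` spelling works essentially unchanged in PyTorch, TensorFlow, and JAX, which is part of its appeal over library-specific named-axis schemes.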