Convolutional neural networks

November 10, 2017 — November 21, 2021

Figure 1: The network topology that more or less kicked off the current revolution in computer vision, and thus the whole modern neural network craze.

Convolutional nets (convnets, or CNNs to the suave) are well described elsewhere, so I’m just going to collect some choice morsels here. In essence, they are classic signal processing baked into neural networks.

There is a long story about how convolutions naturally encourage certain invariances and symmetries, although AFAICT it’s all somewhat hand-wavey.

Generally a convnet uses learned FIR filters plus some smudgy “pooling” operations.
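The whole pipeline fits in a few lines of numpy. Here is a minimal sketch of one conv-then-pool stage: a sliding FIR filter (note that what deep learning libraries call “convolution” is really cross-correlation, i.e. no kernel flip), a ReLU, and non-overlapping max pooling. The function names are mine, purely illustrative.

```python
import numpy as np

def conv2d_valid(x, w):
    """Naive 2-D 'valid' sliding-window filter (cross-correlation,
    as deep learning libraries compute it)."""
    H, W = x.shape
    kH, kW = w.shape
    out = np.empty((H - kH + 1, W - kW + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kH, j:j + kW] * w)
    return out

def max_pool(x, k=2):
    """Non-overlapping k x k max pooling: the 'smudgy' part."""
    H, W = x.shape
    x = x[:H - H % k, :W - W % k]          # crop to a multiple of k
    return x.reshape(H // k, k, W // k, k).max(axis=(1, 3))

x = np.arange(36.0).reshape(6, 6)          # toy 6x6 'image'
edge = np.array([[-1.0, 1.0]])             # horizontal difference filter
y = max_pool(np.maximum(conv2d_valid(x, edge), 0.0))  # conv -> ReLU -> pool
```

In a real convnet the kernel `edge` would be learned by gradient descent and there would be many channels, but the signal-processing core is exactly this.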

1 Visualising

Figure 2

Here is a nice visualisation of convolution arithmetic: vdumoulin/conv_arithmetic

Visualising the actual activations of a convnet is an interesting data visualisation challenge, since intermediate activations often end up being high-rank tensors. But they have a lot of regularity that can be exploited, so it feels like it should be feasible.
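One standard way to exploit that regularity: every channel of a layer’s activation shares the same spatial axes, so a rank-3 `(channels, height, width)` tensor flattens into a near-square mosaic of tiles that `imshow` can display. A sketch (function name is mine):

```python
import numpy as np

def tile_feature_maps(acts, pad=1):
    """Tile a (C, H, W) activation tensor into one 2-D image grid,
    with `pad` pixels of gutter between tiles."""
    C, H, W = acts.shape
    cols = int(np.ceil(np.sqrt(C)))        # near-square layout
    rows = int(np.ceil(C / cols))
    grid = np.zeros((rows * (H + pad) - pad, cols * (W + pad) - pad))
    for c in range(C):
        r, k = divmod(c, cols)
        grid[r * (H + pad):r * (H + pad) + H,
             k * (W + pad):k * (W + pad) + W] = acts[c]
    return grid

acts = np.random.rand(10, 8, 8)            # e.g. 10 feature maps from one layer
img = tile_feature_maps(acts)              # ready for plt.imshow(img)
```

This only handles one layer at a time; animating over layers or inputs is where it gets properly hard.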

Figure 3: Terence Broad’s convnet visualizer

2 Connection to filter theory

TBC. For now, work it out from the other signal processing material linked here.
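The core fact to work from is that a convolutional layer’s filtering step is plain LTI filtering, so the convolution theorem applies: circular convolution in the signal domain is pointwise multiplication of DFTs. A quick numerical check of that identity:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(64)                # input signal
h = rng.standard_normal(5)                 # short FIR kernel, as a convnet learns
h_pad = np.zeros_like(x)
h_pad[:h.size] = h                         # zero-pad kernel to signal length

# Frequency domain: multiply the DFTs, transform back.
via_fft = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(h_pad)))

# Signal domain: direct circular convolution, from the definition.
direct = np.array([
    sum(h_pad[k] * x[(n - k) % x.size] for k in range(x.size))
    for n in range(x.size)
])
```

This equivalence is what the FFT-based convnet papers in the references (e.g. the tile-based FFT and frequency-domain packing ones) exploit for speed.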

3 Resnets

Interesting, and they pop up in fun places like Dynamical models of neural nets. TBD.
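The dynamical-systems reading falls out of the defining equation. A residual block computes y = x + F(x), so the network learns a perturbation of the identity, and stacking blocks looks like forward-Euler steps x ↦ x + f(x) of an ODE. A minimal sketch using a dense F (real resnet blocks use convolutions; this is just to show the skip connection):

```python
import numpy as np

def residual_block(x, W1, W2):
    """y = x + F(x), with F a tiny two-layer ReLU network.
    The identity skip path means zero weights give the identity map."""
    h = np.maximum(x @ W1, 0.0)        # ReLU
    return x + h @ W2                  # skip connection + learned residual

d = 4
x = np.ones(d)
y = residual_block(x, np.zeros((d, d)), np.zeros((d, d)))  # F == 0 here
```

With all-zero weights the block is exactly the identity, which is why resnets are easy to initialise deep: each block starts as a no-op.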

Figure 4

4 References

Defferrard, Bresson, and Vandergheynst. 2016. “Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering.” In Advances In Neural Information Processing Systems.
Gonzalez. 2018. “Deep Convolutional Neural Networks [Lecture Notes].” IEEE Signal Processing Magazine.
Gu, Johnson, Goel, et al. 2021. “Combining Recurrent, Convolutional, and Continuous-Time Models with Linear State Space Layers.” In Advances in Neural Information Processing Systems.
Kipf, and Welling. 2016. “Semi-Supervised Classification with Graph Convolutional Networks.” arXiv:1609.02907 [Cs, Stat].
Krizhevsky, Sutskever, and Hinton. 2012. “Imagenet Classification with Deep Convolutional Neural Networks.” In Advances in Neural Information Processing Systems.
Kulkarni, Whitney, Kohli, et al. 2015. “Deep Convolutional Inverse Graphics Network.” arXiv:1503.03167 [Cs].
LeCun, Bengio, and Hinton. 2015. “Deep Learning.” Nature.
Lee, Grosse, Ranganath, et al. 2009. “Convolutional Deep Belief Networks for Scalable Unsupervised Learning of Hierarchical Representations.” In Proceedings of the 26th Annual International Conference on Machine Learning. ICML ’09.
Lin, and Yao. 2019. “A Fast Algorithm for Convolutional Neural Networks Using Tile-Based Fast Fourier Transforms.” Neural Processing Letters.
Mallat. 2016. “Understanding Deep Convolutional Networks.” arXiv:1601.04920 [Cs, Stat].
Mousavi, and Baraniuk. 2017. “Learning to Invert: Signal Recovery via Deep Convolutional Networks.” In ICASSP.
Passaro, and Zitnick. 2023. “Reducing SO(3) Convolutions to SO(2) for Efficient Equivariant GNNs.”
Paul, and Venkatasubramanian. 2014. “Why Does Deep Learning Work? - A Perspective from Group Theory.” arXiv:1412.6621 [Cs, Stat].
Rawat, and Wang. 2017. “Deep Convolutional Neural Networks for Image Classification: A Comprehensive Review.” Neural Computation.
Ronneberger, Fischer, and Brox. 2015. “U-Net: Convolutional Networks for Biomedical Image Segmentation.” Edited by Nassir Navab, Joachim Hornegger, William M. Wells, and Alejandro F. Frangi. Medical Image Computing and Computer-Assisted Intervention – MICCAI 2015. Lecture Notes in Computer Science.
Shelhamer, Long, and Darrell. 2017. “Fully Convolutional Networks for Semantic Segmentation.” IEEE Transactions on Pattern Analysis and Machine Intelligence.
Springenberg, Dosovitskiy, Brox, et al. 2014. “Striving for Simplicity: The All Convolutional Net.” In Proceedings of International Conference on Learning Representations (ICLR) 2015.
Urban, Geras, Kahou, et al. 2016. “Do Deep Convolutional Nets Really Need to Be Deep (Or Even Convolutional)?” arXiv:1603.05691 [Cs, Stat].
Wang, Xu, Xu, et al. 2019. “Packing Convolutional Neural Networks in the Frequency Domain.” IEEE Transactions on Pattern Analysis and Machine Intelligence.
Wiatowski, and Bölcskei. 2015. “A Mathematical Theory of Deep Convolutional Neural Networks for Feature Extraction.” In Proceedings of IEEE International Symposium on Information Theory.
Wiatowski, Grohs, and Bölcskei. 2018. “Energy Propagation in Deep Convolutional Neural Networks.” IEEE Transactions on Information Theory.