Convolutional neural networks

The network topology that more or less kicked off the current revolution in computer vision and thus the whole modern neural network craze.

Convolutional nets (convnets, or CNNs to the suave) are well described elsewhere. I'm going to collect some choice morsels here. Classic signal processing baked into neural networks.

There is a long story about how convolutions naturally encourage certain invariances and symmetries, although AFAICT it's all somewhat hand-wavey.
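The least hand-wavey part of the story is translation equivariance: filtering then shifting gives the same result as shifting then filtering. A minimal numpy sketch, using circular convolution so the identity holds exactly (the signal and filter here are made up for illustration):

```python
import numpy as np

def circ_conv(x, h):
    """Circular convolution of signal x with FIR filter h, via the DFT."""
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(h, n=len(x))))

rng = np.random.default_rng(0)
x = rng.normal(size=32)         # toy 1-D signal
h = np.array([1.0, 2.0, 1.0])   # toy FIR filter
shift = 5

# filter-then-shift equals shift-then-filter, up to float error
lhs = np.roll(circ_conv(x, h), shift)
rhs = circ_conv(np.roll(x, shift), h)
assert np.allclose(lhs, rhs)
```

With zero-padded (rather than circular) convolution the identity only holds away from the boundary, which is one reason the invariance story gets murky in practice.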

Generally uses FIR filters plus some smudgy "pooling".
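Spelled out with no deep-learning library in sight: a "valid" 2-D FIR filtering pass followed by non-overlapping max pooling. This is a toy single-channel sketch (`conv2d_valid` and `max_pool2d` are my own names here; real frameworks batch this over channel stacks and use far faster kernels):

```python
import numpy as np

def conv2d_valid(img, kernel):
    """'Valid' 2-D correlation of a single-channel image with one FIR kernel."""
    kh, kw = kernel.shape
    H, W = img.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool2d(x, size=2):
    """Non-overlapping max pooling; trims edges that don't divide evenly."""
    H, W = x.shape
    H2, W2 = H // size, W // size
    blocks = x[:H2 * size, :W2 * size].reshape(H2, size, W2, size)
    return blocks.max(axis=(1, 3))

img = np.arange(16, dtype=float).reshape(4, 4)
edges = conv2d_valid(img, np.array([[1.0, -1.0]]))  # horizontal difference filter
pooled = max_pool2d(img)                            # -> [[5., 7.], [13., 15.]]
```

The "smudginess" is the pooling step: it throws away exact positions in exchange for a little translation tolerance.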


Here is a visualisation of convolutions: vdumoulin/conv_arithmetic

Visualising the actual activations of a convnet is an interesting data visualisation challenge, since intermediate activations often end up being high-rank tensors, but they have a lot of regularity that can be exploited, so it feels like it should be feasible.

Terence Broad's convnet visualizer
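One simple way to exploit that regularity: flatten the channel dimension into a mosaic, one small panel per feature map. A sketch in plain numpy (`tile_activations` is a hypothetical helper name; the result can go straight into any image viewer):

```python
import numpy as np

def tile_activations(acts, pad=1):
    """Lay out a (channels, H, W) activation tensor as one 2-D image,
    one panel per channel, separated by a thin border of the min value."""
    c, h, w = acts.shape
    cols = int(np.ceil(np.sqrt(c)))
    rows = int(np.ceil(c / cols))
    canvas = np.full((rows * (h + pad) - pad, cols * (w + pad) - pad), acts.min())
    for k in range(c):
        r, col = divmod(k, cols)
        canvas[r * (h + pad):r * (h + pad) + h,
               col * (w + pad):col * (w + pad) + w] = acts[k]
    return canvas

acts = np.arange(6 * 4 * 4, dtype=float).reshape(6, 4, 4)
mosaic = tile_activations(acts)   # 6 channels -> a 2x3 grid of 4x4 panels
```

This only handles one layer's worth of channels; visualising how panels change across layers is where it gets genuinely hard.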

Connection to filter theory

TBC. For now work it out from other signal processing link material.
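In the meantime, the one-line version: a convolution kernel is an FIR filter, so it has a frequency response that can be read straight off the DFT. A small numpy sketch with a made-up smoothing kernel:

```python
import numpy as np

h = np.array([0.25, 0.5, 0.25])   # toy smoothing (low-pass) FIR kernel
H = np.fft.rfft(h, n=64)          # frequency response sampled at 33 points
mag = np.abs(H)

# DC gain is 1 (constants pass through); the Nyquist bin is 0
# (the fastest-alternating signal is annihilated): a low-pass filter.
print(round(mag[0], 6), round(mag[-1], 6))  # -> 1.0 0.0
```

Whatever a conv layer learns, each kernel can be inspected this way, which is the bridge to classical filter theory.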


Interesting, and they pop up in fun places like Dynamical models of neural nets. TBD.


Defferrard, Michaël, Xavier Bresson, and Pierre Vandergheynst. 2016. "Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering." In Advances in Neural Information Processing Systems.
Gonzalez, R. C. 2018. "Deep Convolutional Neural Networks [Lecture Notes]." IEEE Signal Processing Magazine 35 (6): 79–87.
Gu, Albert, Isys Johnson, Karan Goel, Khaled Saab, Tri Dao, Atri Rudra, and Christopher Ré. 2021. "Combining Recurrent, Convolutional, and Continuous-Time Models with Linear State Space Layers." In Advances in Neural Information Processing Systems, 34:572–85. Curran Associates, Inc.
Kipf, Thomas N., and Max Welling. 2016. "Semi-Supervised Classification with Graph Convolutional Networks." In arXiv:1609.02907 [Cs, Stat].
Krizhevsky, Alex, Ilya Sutskever, and Geoffrey E. Hinton. 2012. "ImageNet Classification with Deep Convolutional Neural Networks." In Advances in Neural Information Processing Systems, 1097–1105.
Kulkarni, Tejas D., Will Whitney, Pushmeet Kohli, and Joshua B. Tenenbaum. 2015. "Deep Convolutional Inverse Graphics Network." arXiv:1503.03167 [Cs], March.
LeCun, Yann, Yoshua Bengio, and Geoffrey Hinton. 2015. "Deep Learning." Nature 521 (7553): 436–44.
Lee, Honglak, Roger Grosse, Rajesh Ranganath, and Andrew Y. Ng. 2009. "Convolutional Deep Belief Networks for Scalable Unsupervised Learning of Hierarchical Representations." In Proceedings of the 26th Annual International Conference on Machine Learning, 609–16. ICML '09. New York, NY, USA: ACM.
Lin, Jinhua, and Yu Yao. 2019. "A Fast Algorithm for Convolutional Neural Networks Using Tile-Based Fast Fourier Transforms." Neural Processing Letters 50 (2): 1951–67.
Mallat, Stéphane. 2016. "Understanding Deep Convolutional Networks." arXiv:1601.04920 [Cs, Stat], January.
Mousavi, Ali, and Richard G. Baraniuk. 2017. "Learning to Invert: Signal Recovery via Deep Convolutional Networks." In ICASSP.
Passaro, Saro, and C. Lawrence Zitnick. 2023. "Reducing SO(3) Convolutions to SO(2) for Efficient Equivariant GNNs." arXiv.
Paul, Arnab, and Suresh Venkatasubramanian. 2014. "Why Does Deep Learning Work? - A Perspective from Group Theory." arXiv:1412.6621 [Cs, Stat], December.
Rawat, Waseem, and Zenghui Wang. 2017. "Deep Convolutional Neural Networks for Image Classification: A Comprehensive Review." Neural Computation 29 (9): 2352–2449.
Ronneberger, Olaf, Philipp Fischer, and Thomas Brox. 2015. "U-Net: Convolutional Networks for Biomedical Image Segmentation." Edited by Nassir Navab, Joachim Hornegger, William M. Wells, and Alejandro F. Frangi. Medical Image Computing and Computer-Assisted Intervention – MICCAI 2015. Lecture Notes in Computer Science. Cham: Springer International Publishing.
Shelhamer, Evan, Jonathan Long, and Trevor Darrell. 2017. "Fully Convolutional Networks for Semantic Segmentation." IEEE Transactions on Pattern Analysis and Machine Intelligence 39 (4): 640–51.
Springenberg, Jost Tobias, Alexey Dosovitskiy, Thomas Brox, and Martin Riedmiller. 2014. "Striving for Simplicity: The All Convolutional Net." In Proceedings of International Conference on Learning Representations (ICLR) 2015.
Urban, Gregor, Krzysztof J. Geras, Samira Ebrahimi Kahou, Ozlem Aslan, Shengjie Wang, Rich Caruana, Abdelrahman Mohamed, Matthai Philipose, and Matt Richardson. 2016. "Do Deep Convolutional Nets Really Need to Be Deep (Or Even Convolutional)?" arXiv:1603.05691 [Cs, Stat], March.
Wang, Yunhe, Chang Xu, Chao Xu, and Dacheng Tao. 2019. "Packing Convolutional Neural Networks in the Frequency Domain." IEEE Transactions on Pattern Analysis and Machine Intelligence 41 (10): 2495–2510.
Wiatowski, Thomas, and Helmut Bölcskei. 2015. "A Mathematical Theory of Deep Convolutional Neural Networks for Feature Extraction." In Proceedings of IEEE International Symposium on Information Theory.
Wiatowski, Thomas, Philipp Grohs, and Helmut Bölcskei. 2018. "Energy Propagation in Deep Convolutional Neural Networks." IEEE Transactions on Information Theory 64 (7): 1–1.
