Graph neural nets

September 16, 2020 — December 19, 2022

algebra
functional analysis
geometry
machine learning
networks
neural nets

Neural networks applied to graph data. Neural networks can of course already be represented as directed graphs, or applied to phenomena arising from a causal graph, but that is not what we mean here. What we mean is using information about graph topology as a feature input (and possibly output) for a neural network. In practice this is usually some variant of applying convnets to spectral graph representations; there is an obvious analogy to generic graph computations and belief propagation, but I do not know how those play out in practice.
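
To make the graph-convolution idea concrete, here is the propagation rule of the plain GCN layer (one of the methods the Chami et al. taxonomy below generalizes); this is textbook material rather than anything specific to this note:

$$
H^{(\ell+1)} = \sigma\left(\tilde{D}^{-1/2}\,\tilde{A}\,\tilde{D}^{-1/2}\,H^{(\ell)}\,W^{(\ell)}\right),
\qquad \tilde{A} = A + I,\quad \tilde{D}_{ii} = \sum_j \tilde{A}_{ij},
$$

where $A$ is the graph adjacency matrix, $H^{(\ell)}$ the matrix of node features at layer $\ell$, and $W^{(\ell)}$ a learned weight matrix; the topology enters the network only through the normalized adjacency.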

I am not closely following this area at the moment, so be aware content may not be current.

1 Tutorials

Pantelis Elinas wrote a good tutorial. One of his motivating examples is nifty: he argues graph learning is powerful because it subsumes the fundamental problem of discovering knowledge graphs, and thus research discovery itself. He recommends the following summaries: M. M. Bronstein et al. (2017); M. M. Bronstein et al. (2021); Hamilton (2020); Hamilton, Ying, and Leskovec (2018); Xia et al. (2021).

Chami et al. (2022):

There has been a surge of recent interest in graph representation learning (GRL). GRL methods have generally fallen into three main categories, based on the availability of labeled data. The first, network embedding, focuses on learning unsupervised representations of relational structure. The second, graph regularized neural networks, leverages graphs to augment neural network losses with a regularization objective for semi-supervised learning. The third, graph neural networks, aims to learn differentiable functions over discrete topologies with arbitrary structure. However, despite the popularity of these areas there has been surprisingly little work on unifying the three paradigms. Here, we aim to bridge the gap between network embedding, graph regularization and graph neural networks. We propose a comprehensive taxonomy of GRL methods, aiming to unify several disparate bodies of work. Specifically, we propose the GraphEDM framework, which generalizes popular algorithms for semi-supervised learning (e.g. GraphSage, GCN, GAT), and unsupervised learning (e.g. DeepWalk, node2vec) of graph representations into a single consistent approach. To illustrate the generality of GraphEDM, we fit over thirty existing methods into this framework. We believe that this unifying view both provides a solid foundation for understanding the intuition behind these methods, and enables future research in the area.

2 Fun tweaks

Distance encoding:

Distance Encoding is a general class of graph-structure-related features that can be utilized by graph neural networks to improve the structural representation power. Given a node set whose structural representation is to be learnt, DE for a node over the graph is defined as a mapping of a set of landing probabilities of random walks from each node of the node set of interest to this node. Distance encoding generally includes measures such as shortest-path-distances and generalized PageRank scores. Distance encoding can be merged into the design of graph neural networks in simple but effective ways: First, we propose DEGNN that utilizes distance encoding as extra node features. We further enhance DEGNN by allowing distance encoding to control the aggregation procedure of traditional GNNs, which yields another model DEAGNN. Since distance encoding purely depends on the graph structure and is independent from node identifiers, it has inductive and generalization ability.
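
A minimal sketch of the simplest flavour of distance encoding, shortest-path distances to a target node set computed with networkx and stacked as extra node features; the actual DEGNN/DEAGNN models of Li et al. (2020) generalize this to random-walk landing probabilities and to controlling the aggregation step, so treat this as illustrative only:

```python
import networkx as nx
import numpy as np

# toy graph and a target node set whose representation we care about
G = nx.karate_club_graph()
target_set = {0, 33}

# distance encoding, simplest variant: shortest-path distance from every node
# to each node in the target set, used as extra node features
de = np.zeros((G.number_of_nodes(), len(target_set)))
for j, t in enumerate(sorted(target_set)):
    lengths = nx.single_source_shortest_path_length(G, t)
    for v, d in lengths.items():
        de[v, j] = d

# de can now be concatenated to the ordinary node feature matrix
print(de.shape)  # (34, 2)
```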

3 Tooling

3.1 PyTorch Geometric
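
For flavour, a minimal sketch of a two-layer GCN in PyTorch Geometric applied to a toy graph; the graph, feature dimensions and layer sizes here are arbitrary choices:

```python
import torch
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv

# toy graph: 3 nodes, undirected edges 0-1 and 1-2 stored as directed pairs
edge_index = torch.tensor([[0, 1, 1, 2],
                           [1, 0, 2, 1]], dtype=torch.long)
x = torch.randn(3, 8)  # 8-dimensional node features
data = Data(x=x, edge_index=edge_index)


class GCN(torch.nn.Module):
    def __init__(self, in_dim, hidden_dim, out_dim):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, out_dim)

    def forward(self, x, edge_index):
        x = self.conv1(x, edge_index).relu()
        return self.conv2(x, edge_index)


model = GCN(in_dim=8, hidden_dim=16, out_dim=2)
out = model(data.x, data.edge_index)  # per-node logits, shape [3, 2]
```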

3.2 Deep Graph Library

Deep Graph Library:

Build your models with PyTorch, TensorFlow or Apache MXNet.

Fast and memory-efficient message passing primitives for training Graph Neural Networks. Scale to giant graphs via multi-GPU acceleration and distributed training infrastructure.
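
The message-passing primitives are the distinguishing feature. A minimal sketch of one message-passing round in DGL, where each node sums the features of its in-neighbours; the graph, feature sizes and field names are toy choices:

```python
import torch
import dgl
import dgl.function as fn

# toy directed graph with edges 0->1, 1->2, 2->0
g = dgl.graph((torch.tensor([0, 1, 2]), torch.tensor([1, 2, 0])))
g.ndata["h"] = torch.randn(3, 4)  # 4-dimensional node features

# one round of message passing: copy source features along each edge ("m"),
# then sum the incoming messages at each destination node ("h_sum")
g.update_all(fn.copy_u("h", "m"), fn.sum("m", "h_sum"))
print(g.ndata["h_sum"].shape)  # torch.Size([3, 4])
```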

3.3 GTN

Facebook’s GTN:

GTN is an open source framework for automatic differentiation with a powerful, expressive type of graph called weighted finite-state transducers (WFSTs). Just as PyTorch provides a framework for automatic differentiation with tensors, GTN provides such a framework for WFSTs. AI researchers and engineers can use GTN to more effectively train graph-based machine learning models.

I have not used GTN, so I cannot say whether I have filed this item correctly or whether it is better thought of as a computational-graph learning tool.

4 Background: Graph filtering

A lot of this area builds on classic linear systems theory applied to networks, in the form of spectral graph theory. See signal processing on graphs.
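
As a reminder of the underlying machinery, a minimal numpy sketch of a spectral graph filter (here a heat kernel) acting on a signal over a toy path graph; the graph and the choice of filter are arbitrary:

```python
import numpy as np

# toy undirected path graph on 4 nodes, adjacency matrix A
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
D = np.diag(A.sum(axis=1))
L = D - A  # combinatorial graph Laplacian

# spectral filter: h(L) x = U h(Lambda) U^T x, with heat kernel h(l) = exp(-t l)
evals, U = np.linalg.eigh(L)
t = 0.5
x = np.random.randn(4)  # a signal on the nodes
x_filtered = U @ (np.exp(-t * evals) * (U.T @ x))
```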

5 Connection to physics

Bronstein again, Beyond Message Passing, a Physics-Inspired Paradigm for Graph Neural Networks.

6 Connection to graphical model learning

e.g. Yu et al. (2019), who learn DAG structure with a graph neural network.

7 References

Bacciu, Errica, Micheli, et al. 2020. “A Gentle Introduction to Deep Learning for Graphs.” Neural Networks.
Bresson, and Laurent. 2018. “An Experimental Study of Neural Networks for Variable Graphs.”
Bronstein, Michael. 2022. “Beyond Message Passing: A Physics-Inspired Paradigm for Graph Neural Networks.” The Gradient.
Bronstein, Michael M., Bruna, Cohen, et al. 2021. “Geometric Deep Learning: Grids, Groups, Graphs, Geodesics, and Gauges.” arXiv:2104.13478 [Cs, Stat].
Bronstein, Michael M., Bruna, LeCun, et al. 2017. “Geometric Deep Learning: Going Beyond Euclidean Data.” IEEE Signal Processing Magazine.
Bui, Ravi, and Ramavajjala. 2017. “Neural Graph Machines: Learning Neural Networks Using Graphs.” arXiv:1703.04818 [Cs].
Chami, Abu-El-Haija, Perozzi, et al. 2022. “Machine Learning on Graphs: A Model and Comprehensive Taxonomy.” Journal of Machine Learning Research.
Cranmer, Xu, Battaglia, et al. 2019. “Learning Symbolic Physics with Graph Networks.” In Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS).
Defferrard, Bresson, and Vandergheynst. 2016. “Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering.” In Advances In Neural Information Processing Systems.
Defferrard, Milani, Gusset, et al. 2020. “DeepSphere: A Graph-Based Spherical CNN.” arXiv:2012.15000 [Cs, Stat].
Defferrard, Perraudin, Kacprzak, et al. 2019. “DeepSphere: Towards an Equivariant Graph-Based Spherical CNN.” arXiv:1904.05146 [Cs, Stat].
Di Giovanni, Rowbottom, Chamberlain, et al. 2022. “Graph Neural Networks as Gradient Flows.”
Dwivedi, Joshi, Laurent, et al. 2020. “Benchmarking Graph Neural Networks.” arXiv:2003.00982 [Cs, Stat].
Garg, Jegelka, and Jaakkola. 2020. “Generalization and Representational Limits of Graph Neural Networks.” In Proceedings of the 37th International Conference on Machine Learning.
Hamilton. 2020. “Graph Representation Learning.” Synthesis Lectures on Artificial Intelligence and Machine Learning.
Hamilton, Ying, and Leskovec. 2018. “Representation Learning on Graphs: Methods and Applications.” arXiv:1709.05584 [Cs].
Hannun, Pratap, Kahn, et al. 2020. “Differentiable Weighted Finite-State Transducers.” arXiv:2010.01003 [Cs, Stat].
Huang, He, Singh, et al. 2020. “Combining Label Propagation and Simple Models Out-Performs Graph Neural Networks.” arXiv:2010.13993 [Cs].
Isufi, Loukas, Simonetto, et al. 2017. “Autoregressive Moving Average Graph Filtering.” IEEE Transactions on Signal Processing.
Jegelka. 2022. “Theory of Graph Neural Networks: Representation and Learning.”
Lamb, Garcez, Gori, et al. 2020. “Graph Neural Networks Meet Neural-Symbolic Computing: A Survey and Perspective.” In IJCAI 2020.
Lam, Sanchez-Gonzalez, Willson, et al. 2023a. “GraphCast: Learning Skillful Medium-Range Global Weather Forecasting.”
———, et al. 2023b. “Learning Skillful Medium-Range Global Weather Forecasting.” Science.
Lange, and Kutz. 2021. “FC2T2: The Fast Continuous Convolutional Taylor Transform with Applications in Vision and Graphics.” arXiv:2111.00110 [Cs].
Li, Wang, Wang, et al. 2020. “Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning.” arXiv:2009.00142 [Cs, Stat].
Ng, Fang, Zhu, et al. 2020. “Masked Gradient-Based Causal Structure Learning.” arXiv:1910.08527 [Cs, Stat].
Ng, Zhu, Chen, et al. 2019. “A Graph Autoencoder Approach to Causal Structure Learning.” In Advances In Neural Information Processing Systems.
Passaro, and Zitnick. 2023. “Reducing SO(3) Convolutions to SO(2) for Efficient Equivariant GNNs.”
Sanchez-Gonzalez, Bapst, Battaglia, et al. 2019. “Hamiltonian Graph Networks with ODE Integrators.” In Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS).
Shuman, D. I., Narang, Frossard, et al. 2013. “The Emerging Field of Signal Processing on Graphs: Extending High-Dimensional Data Analysis to Networks and Other Irregular Domains.” IEEE Signal Processing Magazine.
Shuman, David I., Vandergheynst, and Frossard. 2011. “Chebyshev Polynomial Approximation for Distributed Signal Processing.” 2011 International Conference on Distributed Computing in Sensor Systems and Workshops (DCOSS).
Tahmasebi, Lim, and Jegelka. 2020. “Counting Substructures with Higher-Order Graph Neural Networks: Possibility and Impossibility Results.”
Xia, Sun, Yu, et al. 2021. “Graph Learning: A Survey.” IEEE Transactions on Artificial Intelligence.
Yang, Zhang, Wu, et al. 2023. “A Comprehensive Survey of Graph-Level Learning.”
Yu, Chen, Gao, et al. 2019. “DAG-GNN: DAG Structure Learning with Graph Neural Networks.” Proceedings of the 36th International Conference on Machine Learning.
Zaheer, Kottur, Ravanbakhsh, et al. 2017. “Deep Sets.” In Advances in Neural Information Processing Systems.
Zambon, Cini, Livi, et al. 2023. “Graph State-Space Models.”