Graph neural nets



Neural networks applied to graph data. Neural networks can of course already be represented as directed graphs, or applied to phenomena arising from a causal graph, but that is not what we mean here. What we mean is using information about graph topology as a feature input (and possibly output) for a neural network. In practice this is usually some variant of applying convnets to spectral graph representations; there is an obvious analogy to generic graph computations and belief propagation, but I do not know how those play out in practice.
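To make that concrete, here is a minimal sketch of a single graph-convolution layer in the GCN style, in plain NumPy. The symmetric degree normalisation is the standard textbook choice; nothing here is specific to any one library, and the function name is my own.

```python
import numpy as np

def gcn_layer(A, X, W):
    """One graph-convolution layer: H = ReLU(D^{-1/2} (A + I) D^{-1/2} X W).

    A: (n, n) adjacency matrix, X: (n, d) node features, W: (d, k) weights.
    """
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    d = A_hat.sum(axis=1)                   # degree vector of A_hat
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))  # D^{-1/2}
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ X @ W, 0.0)

# Toy example: a 4-node path graph with 2-d node features.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.randn(4, 2)
W = np.random.randn(2, 3)
H = gcn_layer(A, X, W)  # (4, 3) node embeddings
```

One such layer mixes each node's features with its immediate neighbours'; stacking layers widens the receptive field over the graph.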

I am not closely following this area at the moment, so be aware content may not be current.

Tutorials

Pantelis Elinas wrote a good tutorial. One of his motivating examples is nifty: he argues graph learning is powerful because it includes the fundamental problem of discovering knowledge graphs, and thus research discovery itself. He recommends the following summaries: Bronstein et al. (2017); Bronstein et al. (2021); Hamilton (2020); Hamilton, Ying, and Leskovec (2018); Xia et al. (2021).

Chami et al. (2022):

There has been a surge of recent interest in graph representation learning (GRL). GRL methods have generally fallen into three main categories, based on the availability of labeled data. The first, network embedding, focuses on learning unsupervised representations of relational structure. The second, graph regularized neural networks, leverages graphs to augment neural network losses with a regularization objective for semi-supervised learning. The third, graph neural networks, aims to learn differentiable functions over discrete topologies with arbitrary structure. However, despite the popularity of these areas there has been surprisingly little work on unifying the three paradigms. Here, we aim to bridge the gap between network embedding, graph regularization and graph neural networks. We propose a comprehensive taxonomy of GRL methods, aiming to unify several disparate bodies of work. Specifically, we propose the GraphEDM framework, which generalizes popular algorithms for semi-supervised learning (e.g. GraphSage, GCN, GAT), and unsupervised learning (e.g. DeepWalk, node2vec) of graph representations into a single consistent approach. To illustrate the generality of GraphEDM, we fit over thirty existing methods into this framework. We believe that this unifying view both provides a solid foundation for understanding the intuition behind these methods, and enables future research in the area.

Fun tweaks

Distance encoding:

Distance Encoding is a general class of graph-structure-related features that can be utilized by graph neural networks to improve the structural representation power. Given a node set whose structural representation is to be learnt, DE for a node over the graph is defined as a mapping of a set of landing probabilities of random walks from each node of the node set of interest to this node. Distance encoding generally includes measures such as shortest-path-distances and generalized PageRank scores. Distance encoding can be merged into the design of graph neural networks in simple but effective ways: First, we propose DEGNN that utilizes distance encoding as extra node features. We further enhance DEGNN by allowing distance encoding to control the aggregation procedure of traditional GNNs, which yields another model DEAGNN. Since distance encoding purely depends on the graph structure and is independent from node identifiers, it has inductive and generalization ability.
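Here is a rough sketch of the simplest flavour of this idea — shortest-path-distance features relative to a target node set — using networkx. The function name and the distance cutoff are my own illustration, not the authors' code.

```python
import networkx as nx
import numpy as np

def shortest_path_encoding(G, target_set, max_dist=5):
    """Per-node feature vector: clipped shortest-path distance to each
    node in `target_set` (the set whose representation is being learnt)."""
    n = G.number_of_nodes()
    enc = np.full((n, len(target_set)), max_dist, dtype=float)
    for j, t in enumerate(target_set):
        dists = nx.single_source_shortest_path_length(G, t, cutoff=max_dist)
        for node, d in dists.items():
            enc[node, j] = d
    return enc

G = nx.karate_club_graph()
de = shortest_path_encoding(G, target_set=[0, 33])  # (34, 2) extra features
```

In the paper's terms, DEGNN would concatenate features like these onto the node inputs, while DEAGNN additionally uses them to modulate the aggregation step.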

Tooling

PyTorch Geometric

Deep Graph Library

Deep Graph Library:

Build your models with PyTorch, TensorFlow or Apache MXNet.

Fast and memory-efficient message passing primitives for training Graph Neural Networks. Scale to giant graphs via multi-GPU acceleration and distributed training infrastructure.
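As a rough illustration of what those message-passing primitives look like — a sketch from my reading of the DGL API, so check the current docs before relying on it:

```python
import torch
import dgl
import dgl.function as fn

# A toy directed graph with 4 nodes and edges 0->1, 1->2, 2->3, 3->0.
g = dgl.graph((torch.tensor([0, 1, 2, 3]), torch.tensor([1, 2, 3, 0])))
g.ndata["h"] = torch.randn(4, 8)  # 8-d node features

# One round of message passing: each node averages its in-neighbours'
# features. fn.copy_u emits the source feature as the message;
# fn.mean reduces the incoming messages.
g.update_all(fn.copy_u("h", "m"), fn.mean("m", "h_new"))
print(g.ndata["h_new"].shape)  # torch.Size([4, 8])
```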

GTN

Facebook’s GTN:

GTN is an open source framework for automatic differentiation with a powerful, expressive type of graph called weighted finite-state transducers (WFSTs). Just as PyTorch provides a framework for automatic differentiation with tensors, GTN provides such a framework for WFSTs. AI researchers and engineers can use GTN to more effectively train graph-based machine learning models.
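For flavour, here is a tiny example adapted from the project README. I have not run it, so treat the exact calls (`Graph`, `add_node`, `add_arc`, `forward_score`, `backward`) as assumptions to verify against the docs.

```python
import gtn  # https://github.com/facebookresearch/gtn (API assumed from its README)

# Build a tiny WFST: a start node and an accept node joined by two arcs.
g = gtn.Graph()
g.add_node(True)         # start node (id 0)
g.add_node(False, True)  # accept node (id 1)
g.add_arc(0, 1, 0)       # src, dst, label 0
g.add_arc(0, 1, 1)       # a second arc with label 1

# The forward score is differentiable, like a loss on a computation graph;
# gradients end up attached to the arcs of g.
score = gtn.forward_score(g)
gtn.backward(score)
```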

I have not used GTN, so I cannot say whether I have filed this item correctly or whether it is more of a computational-graph learning tool.

Background: Graph filtering

Much of this seems to be based on classic linear systems theory applied to networks, in the form of spectral graph theory. See signal processing on graphs.
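As a bare-bones illustration of what "filtering on a graph" means: diagonalise the normalised graph Laplacian, transform a node signal into the graph Fourier basis, and attenuate the high graph frequencies. The heat-kernel filter below is an arbitrary low-pass choice, purely for illustration.

```python
import numpy as np

def graph_lowpass(A, x, tau=1.0):
    """Low-pass filter a node signal x via the heat kernel exp(-tau * L),
    where L is the symmetric normalised graph Laplacian."""
    d = A.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    L = np.eye(A.shape[0]) - D_inv_sqrt @ A @ D_inv_sqrt
    lam, U = np.linalg.eigh(L)                # graph Fourier basis
    x_hat = U.T @ x                           # graph Fourier transform
    return U @ (np.exp(-tau * lam) * x_hat)   # attenuate high frequencies

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
x = np.random.randn(4)
x_smooth = graph_lowpass(A, x)
```

Spectral GNNs like ChebNet (Defferrard, Bresson, and Vandergheynst 2016) avoid the explicit eigendecomposition by approximating such filters with Chebyshev polynomials of the Laplacian.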

Connection to graphical model learning

e.g. Yu et al. (2019).

References

Bacciu, Davide, Federico Errica, Alessio Micheli, and Marco Podda. 2020. “A Gentle Introduction to Deep Learning for Graphs.” Neural Networks 129 (September): 203–21.
Bresson, Xavier, and Thomas Laurent. 2018. “An Experimental Study of Neural Networks for Variable Graphs,” 4.
Bronstein, Michael M., Joan Bruna, Taco Cohen, and Petar Veličković. 2021. “Geometric Deep Learning: Grids, Groups, Graphs, Geodesics, and Gauges.” arXiv:2104.13478 [Cs, Stat], May.
Bronstein, Michael M., Joan Bruna, Yann LeCun, Arthur Szlam, and Pierre Vandergheynst. 2017. “Geometric Deep Learning: Going Beyond Euclidean Data.” IEEE Signal Processing Magazine 34 (4): 18–42.
Bui, Thang D., Sujith Ravi, and Vivek Ramavajjala. 2017. “Neural Graph Machines: Learning Neural Networks Using Graphs.” arXiv:1703.04818 [Cs], March.
Chami, Ines, Sami Abu-El-Haija, Bryan Perozzi, Christopher Ré, and Kevin Murphy. 2022. “Machine Learning on Graphs: A Model and Comprehensive Taxonomy.” Journal of Machine Learning Research 23 (89): 1–64.
Cranmer, Miles D, Rui Xu, Peter Battaglia, and Shirley Ho. 2019. “Learning Symbolic Physics with Graph Networks.” In Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS), 6.
Defferrard, Michaël, Xavier Bresson, and Pierre Vandergheynst. 2016. “Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering.” In Advances in Neural Information Processing Systems.
Defferrard, Michaël, Martino Milani, Frédérick Gusset, and Nathanaël Perraudin. 2020. “DeepSphere: A Graph-Based Spherical CNN.” arXiv:2012.15000 [Cs, Stat], December.
Defferrard, Michaël, Nathanaël Perraudin, Tomasz Kacprzak, and Raphael Sgier. 2019. “DeepSphere: Towards an Equivariant Graph-Based Spherical CNN.” arXiv:1904.05146 [Cs, Stat], April.
Di Giovanni, Francesco, James Rowbottom, Benjamin P. Chamberlain, Thomas Markovich, and Michael M. Bronstein. 2022. “Graph Neural Networks as Gradient Flows.” arXiv.
Dwivedi, Vijay Prakash, Chaitanya K. Joshi, Thomas Laurent, Yoshua Bengio, and Xavier Bresson. 2020. “Benchmarking Graph Neural Networks.” arXiv:2003.00982 [Cs, Stat], July.
Garg, Vikas, Stefanie Jegelka, and Tommi Jaakkola. 2020. “Generalization and Representational Limits of Graph Neural Networks.” In Proceedings of the 37th International Conference on Machine Learning, 3419–30. PMLR.
Hamilton, William L. 2020. “Graph Representation Learning.” Synthesis Lectures on Artificial Intelligence and Machine Learning 14 (3): 1–159.
Hamilton, William L., Rex Ying, and Jure Leskovec. 2018. “Representation Learning on Graphs: Methods and Applications.” arXiv:1709.05584 [Cs], April.
Hannun, Awni, Vineel Pratap, Jacob Kahn, and Wei-Ning Hsu. 2020. “Differentiable Weighted Finite-State Transducers.” arXiv:2010.01003 [Cs, Stat], October.
Huang, Qian, Horace He, Abhay Singh, Ser-Nam Lim, and Austin R. Benson. 2020. “Combining Label Propagation and Simple Models Out-Performs Graph Neural Networks.” arXiv:2010.13993 [Cs], November.
Isufi, Elvin, Andreas Loukas, Andrea Simonetto, and Geert Leus. 2017. “Autoregressive Moving Average Graph Filtering.” IEEE Transactions on Signal Processing 65 (2): 274–88.
Jegelka, Stefanie. 2022. “Theory of Graph Neural Networks: Representation and Learning,” April.
Lamb, Luis C., Artur Garcez, Marco Gori, Marcelo Prates, Pedro Avelar, and Moshe Vardi. 2020. “Graph Neural Networks Meet Neural-Symbolic Computing: A Survey and Perspective.” In IJCAI 2020.
Lange, Henning, and J. Nathan Kutz. 2021. “FC2T2: The Fast Continuous Convolutional Taylor Transform with Applications in Vision and Graphics.” arXiv:2111.00110 [Cs], November.
Li, Pan, Yanbang Wang, Hongwei Wang, and Jure Leskovec. 2020. “Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning.” arXiv:2009.00142 [Cs, Stat], October.
Ng, Ignavier, Zhuangyan Fang, Shengyu Zhu, Zhitang Chen, and Jun Wang. 2020. “Masked Gradient-Based Causal Structure Learning.” arXiv:1910.08527 [Cs, Stat], February.
Ng, Ignavier, Shengyu Zhu, Zhitang Chen, and Zhuangyan Fang. 2019. “A Graph Autoencoder Approach to Causal Structure Learning.” In Advances in Neural Information Processing Systems.
Passaro, Saro, and C. Lawrence Zitnick. 2023. “Reducing SO(3) Convolutions to SO(2) for Efficient Equivariant GNNs.” arXiv.
Sanchez-Gonzalez, Alvaro, Victor Bapst, Peter Battaglia, and Kyle Cranmer. 2019. “Hamiltonian Graph Networks with ODE Integrators.” In Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS), 11.
Shuman, D. I., S. K. Narang, P. Frossard, A. Ortega, and P. Vandergheynst. 2013. “The Emerging Field of Signal Processing on Graphs: Extending High-Dimensional Data Analysis to Networks and Other Irregular Domains.” IEEE Signal Processing Magazine 30 (3): 83–98.
Shuman, David I., Pierre Vandergheynst, and Pascal Frossard. 2011. “Chebyshev Polynomial Approximation for Distributed Signal Processing.” 2011 International Conference on Distributed Computing in Sensor Systems and Workshops (DCOSS), June, 1–8.
Tahmasebi, Behrooz, Derek Lim, and Stefanie Jegelka. 2020. “Counting Substructures with Higher-Order Graph Neural Networks: Possibility and Impossibility Results,” December.
Xia, Feng, Ke Sun, Shuo Yu, Abdul Aziz, Liangtian Wan, Shirui Pan, and Huan Liu. 2021. “Graph Learning: A Survey.” IEEE Transactions on Artificial Intelligence 2 (2): 109–27.
Yang, Zhenyu, Ge Zhang, Jia Wu, Jian Yang, Quan Z. Sheng, Shan Xue, Chuan Zhou, et al. 2023. “A Comprehensive Survey of Graph-Level Learning.” arXiv.
Yu, Yue, Jie Chen, Tian Gao, and Mo Yu. 2019. “DAG-GNN: DAG Structure Learning with Graph Neural Networks.” In Proceedings of the 36th International Conference on Machine Learning. PMLR.
Zaheer, Manzil, Satwik Kottur, Siamak Ravanbakhsh, Barnabas Poczos, Russ R Salakhutdinov, and Alexander J Smola. 2017. “Deep Sets.” In Advances in Neural Information Processing Systems. Vol. 30. Curran Associates, Inc.
Zambon, Daniele, Andrea Cini, Lorenzo Livi, and Cesare Alippi. 2023. “Graph State-Space Models.” arXiv.
