Graph neural nets

Neural networks applied to graph data. (Neural networks can of course already be represented as directed graphs, or applied to phenomena arising from a causal graph, but that is not what we mean here.)

The version of graph neural nets with which I am familiar applies convnets to spectral graph representations; Thomas Kipf, e.g., summarises research in that area.
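As a concrete reminder of what that looks like, here is a minimal numpy sketch of the propagation rule popularised by Kipf-style spectral GCNs, H' = σ(D̂^(−1/2) Â D̂^(−1/2) H W) with Â = A + I. The layer sizes, ReLU nonlinearity, and toy graph are my own illustrative choices:

```python
import numpy as np

def gcn_layer(A, H, W):
    """One spectral graph-convolution layer in the Kipf & Welling style:
    H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W)."""
    A_hat = A + np.eye(A.shape[0])           # add self-loops
    d = A_hat.sum(axis=1)                    # degrees of the self-looped graph
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt # symmetric normalisation
    return np.maximum(A_norm @ H @ W, 0.0)   # ReLU

# Toy 4-node path graph, 2-d node features, 3 output channels.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
H = rng.standard_normal((4, 2))
W = rng.standard_normal((2, 3))
out = gcn_layer(A, H, W)                     # shape (4, 3)
```

Each output row mixes a node's own features with those of its neighbours, weighted by the normalised adjacency; stacking such layers widens the receptive field one hop at a time.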

I gather that the field has moved on and I am no longer across what is happening.

See perhaps Xavier Bresson’s implementation of one graph convnet. Maybe check out Chaitanya Joshi’s overviews of spatial graph convnets and his attempt to link them to attention mechanisms.
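To give the attention connection some flavour, here is a toy single-head graph-attention layer loosely in the spirit of Veličković et al.'s GAT: each node attends only over its graph neighbours (plus itself), which is the sense in which a spatial graph convnet is an attention mechanism with a sparsity mask. The LeakyReLU slope and parameter shapes are illustrative assumptions, not anyone's canonical setup:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def graph_attention_layer(A, H, W, a):
    """Single-head neighbourhood attention: attention logits come from a
    learned vector `a` applied to concatenated projected node pairs, and
    each node's output is an attention-weighted sum over its neighbours."""
    n = A.shape[0]
    Z = H @ W                                   # projected node features
    out = np.zeros_like(Z)
    for i in range(n):
        nbrs = np.where((A[i] > 0) | (np.arange(n) == i))[0]
        logits = np.array([np.concatenate([Z[i], Z[j]]) @ a for j in nbrs])
        logits = np.where(logits > 0, logits, 0.2 * logits)  # LeakyReLU
        alpha = softmax(logits)                 # attention over neighbours
        out[i] = alpha @ Z[nbrs]
    return out

# Same toy path graph as a transformer would see with a hard adjacency mask.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(1)
H = rng.standard_normal((4, 2))
W = rng.standard_normal((2, 3))
a = rng.standard_normal(6)                      # 2 * output channels
out = graph_attention_layer(A, H, W, a)         # shape (4, 3)
```

Remove the adjacency mask (let every node attend to every other) and this collapses to ordinary dot-style self-attention, which is roughly the link Joshi draws.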

Facebook’s GTN might be a tool here:

GTN is an open source framework for automatic differentiation with a powerful, expressive type of graph called weighted finite-state transducers (WFSTs). Just as PyTorch provides a framework for automatic differentiation with tensors, GTN provides such a framework for WFSTs. AI researchers and engineers can use GTN to more effectively train graph-based machine learning models.

I have not used GTN, so I cannot say whether I have filed it correctly or whether it is more of a computational-graph learning tool.

Distance encoding, per Li et al. (2020):

Distance Encoding is a general class of graph-structure-related features that can be utilized by graph neural networks to improve the structural representation power. Given a node set whose structural representation is to be learnt, DE for a node over the graph is defined as a mapping of a set of landing probabilities of random walks from each node of the node set of interest to this node. Distance encoding generally includes measures such as shortest-path-distances and generalized PageRank scores. Distance encoding can be merged into the design of graph neural networks in simple but effective ways: First, we propose DEGNN that utilizes distance encoding as extra node features. We further enhance DEGNN by allowing distance encoding to control the aggregation procedure of traditional GNNs, which yields another model DEAGNN. Since distance encoding purely depends on the graph structure and is independent from node identifiers, it has inductive and generalization ability.
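A hedged sketch of the random-walk version of that idea: the k-step landing probability from node s to node v is entry (s, v) of P^k, where P = D^(−1)A is the random-walk transition matrix, and the DE feature for v collects these probabilities over the target node set and a few walk lengths. The function name and the walk-length cap K are my own choices for illustration:

```python
import numpy as np

def distance_encoding(A, target_set, K=3):
    """Random-walk distance encoding in the spirit of Li et al. (2020):
    for each node v, stack the k-step landing probabilities of random
    walks started at each node in `target_set`, for k = 0..K."""
    n = A.shape[0]
    P = A / A.sum(axis=1, keepdims=True)  # row-stochastic transition matrix
    Pk = np.eye(n)                        # P^0
    feats = []
    for _ in range(K + 1):
        feats.append(Pk[list(target_set)])  # landing probs from target nodes
        Pk = Pk @ P
    return np.vstack(feats).T             # shape (n, |target_set| * (K + 1))

# Toy 4-node path graph, encoding distances relative to node 0.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
DE = distance_encoding(A, target_set=[0], K=2)  # shape (4, 3)
```

Because these features depend only on graph structure, never on node identifiers, they can be attached to any GNN's node features (the DEGNN recipe above) without breaking inductive generalisation.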


Bresson, Xavier, and Thomas Laurent. 2018. “An Experimental Study of Neural Networks for Variable Graphs,” 4.
Bronstein, Michael M., Joan Bruna, Yann LeCun, Arthur Szlam, and Pierre Vandergheynst. 2017. “Geometric Deep Learning: Going Beyond Euclidean Data.” IEEE Signal Processing Magazine 34 (4): 18–42.
Bui, Thang D., Sujith Ravi, and Vivek Ramavajjala. 2017. “Neural Graph Machines: Learning Neural Networks Using Graphs.” arXiv:1703.04818 [cs], March.
Cranmer, Miles D, Rui Xu, Peter Battaglia, and Shirley Ho. 2019. “Learning Symbolic Physics with Graph Networks.” In Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS), 6.
Defferrard, Michaël, Xavier Bresson, and Pierre Vandergheynst. 2016. “Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering.” In Advances In Neural Information Processing Systems.
Dwivedi, Vijay Prakash, Chaitanya K. Joshi, Thomas Laurent, Yoshua Bengio, and Xavier Bresson. 2020. “Benchmarking Graph Neural Networks.” arXiv:2003.00982 [cs, Stat], July.
Hannun, Awni, Vineel Pratap, Jacob Kahn, and Wei-Ning Hsu. 2020. “Differentiable Weighted Finite-State Transducers.” arXiv:2010.01003 [cs, Stat], October.
Huang, Qian, Horace He, Abhay Singh, Ser-Nam Lim, and Austin R. Benson. 2020. “Combining Label Propagation and Simple Models Out-Performs Graph Neural Networks.” arXiv:2010.13993 [cs], November.
Lamb, Luis C., Artur Garcez, Marco Gori, Marcelo Prates, Pedro Avelar, and Moshe Vardi. 2020. “Graph Neural Networks Meet Neural-Symbolic Computing: A Survey and Perspective.” In IJCAI 2020.
Li, Pan, Yanbang Wang, Hongwei Wang, and Jure Leskovec. 2020. “Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning.” arXiv:2009.00142 [cs, Stat], October.
Ng, Ignavier, Zhuangyan Fang, Shengyu Zhu, Zhitang Chen, and Jun Wang. 2020. “Masked Gradient-Based Causal Structure Learning.” arXiv:1910.08527 [cs, Stat], February.
Ng, Ignavier, Shengyu Zhu, Zhitang Chen, and Zhuangyan Fang. 2019. “A Graph Autoencoder Approach to Causal Structure Learning.” In Advances In Neural Information Processing Systems.
Sanchez-Gonzalez, Alvaro, Victor Bapst, Peter Battaglia, and Kyle Cranmer. 2019. “Hamiltonian Graph Networks with ODE Integrators.” In Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS), 11.
