Spectral graph theory
Linear signals on graphs
November 24, 2014 — October 27, 2021
Placeholder for classic linear systems theory applied to networks, in the form of spectral graph theory. I split this off from the networks notebook, since that one started to feel mostly like algebraic graph methods. As used in graph neural networks.
I am interested in how filters can be defined on graphs; apparently they can?
Defferrard, Bresson, and Vandergheynst (2016) is credited with introducing localized Chebyshev filtering, defined in terms of the graph Laplacian, to neural networks. David I. Shuman, Vandergheynst, and Frossard (2011) seems to have introduced this in a signal-processing setting (see the overview in D. I. Shuman et al. (2013)). Isufi et al. (2017) defines an analogue of standard linear filters. There is a helpful visual comparison of both methods in Andreas Loukas’ blog post.
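To make the Chebyshev trick concrete, here is a minimal numpy sketch (not the ChebNet implementation) that filters a signal on a tiny path graph by evaluating a Chebyshev polynomial of the rescaled normalized Laplacian. The graph and the coefficients `theta` are made up for illustration.

```python
# A minimal sketch of Chebyshev filtering on a graph signal,
# assuming a small undirected graph and hypothetical filter coefficients.
import numpy as np

# Tiny example graph: a 5-node path.
A = np.zeros((5, 5))
for i in range(4):
    A[i, i + 1] = A[i + 1, i] = 1.0

deg = A.sum(axis=1)
L = np.diag(deg) - A                      # combinatorial Laplacian
D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
L_norm = D_inv_sqrt @ L @ D_inv_sqrt      # normalized Laplacian

lam_max = np.linalg.eigvalsh(L_norm).max()
L_tilde = 2.0 * L_norm / lam_max - np.eye(5)   # rescale spectrum into [-1, 1]

x = np.random.default_rng(0).normal(size=5)    # a graph signal, one value per node
theta = [0.5, 0.3, 0.2]                        # hypothetical filter coefficients

# Chebyshev recurrence applied to the signal:
# T_0 x = x, T_1 x = L_tilde x, T_k x = 2 L_tilde T_{k-1} x - T_{k-2} x.
T_prev, T_curr = x, L_tilde @ x
y = theta[0] * T_prev + theta[1] * T_curr
for k in range(2, len(theta)):
    T_prev, T_curr = T_curr, 2.0 * L_tilde @ T_curr - T_prev
    y = y + theta[k] * T_curr

print(y)   # the filtered signal
```

A degree-\(K\) polynomial in the Laplacian only mixes values within \(K\) hops of each node, which is what makes these filters localized.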
Many overlaps, under different phrasing, with matrix factorisation.
1 Link slurry
Calculate the eigenvalues of your connectivity matrix and learn some stuff.
Or take the Fourier transform and also learn some stuff.
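What “take the Fourier transform” means here: diagonalize the graph Laplacian and project the signal onto its eigenvectors. A hedged sketch, assuming a small ring graph and a random signal, neither of which is canonical:

```python
# Graph Fourier transform as projection onto Laplacian eigenvectors.
import numpy as np

n = 5
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0   # ring graph

L = np.diag(A.sum(axis=1)) - A
eigvals, U = np.linalg.eigh(L)       # columns of U are the graph Fourier modes

x = np.random.default_rng(1).normal(size=n)   # signal on the nodes
x_hat = U.T @ x                      # forward graph Fourier transform
x_back = U @ x_hat                   # inverse transform recovers the signal

print(np.allclose(x, x_back))        # True
print(eigvals)                       # the "graph frequencies"
```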
Kirchhoff’s Theorem gives us, roughly, that the number of spanning trees in a graph is equal to any cofactor of the graph Laplacian matrix (i.e. the determinant of the matrix obtained by deleting row and column \(i\)). Wow.
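As a sanity check (mine, not from any source above): the complete graph \(K_4\) should have \(4^{4-2}=16\) spanning trees by Cayley’s formula, and the Laplacian cofactor agrees.

```python
# Numerical check of the matrix-tree theorem on K_4 (Cayley: 4^(4-2) = 16 trees).
import numpy as np

n = 4
A = np.ones((n, n)) - np.eye(n)          # complete graph K_4
L = np.diag(A.sum(axis=1)) - A           # graph Laplacian

# Delete row and column 0 and take the determinant of the remaining minor.
minor = np.delete(np.delete(L, 0, axis=0), 0, axis=1)
print(round(np.linalg.det(minor)))       # 16
```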
Spielman’s Laplacian Linear Equations, Graph Sparsification, Local Clustering, Low-Stretch Trees, etc. is the best start and links to lots of online textbooks and so on. Is it the same as this other page? Laplacian Linear Equations, Graph Sparsification, Local Clustering, Low-Stretch Trees, etc.:
Shang-Hua Teng and I wrote a large paper on the problem of solving systems of linear equations in the Laplacian matrices of graphs. This paper required many graph-theoretic algorithms, most of which have been greatly improved. This page is an attempt to keep track of the major developments in and applications of these ideas.
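To get a feel for what “solving systems of linear equations in the Laplacian matrices of graphs” buys you, here is a toy sketch, nothing like the Spielman and Teng near-linear-time solvers, that computes electrical potentials on an arbitrary grid graph with an off-the-shelf conjugate gradient solver:

```python
# Solve L v = b where b injects one unit of current at node 0 and extracts it
# at the last node; v then gives the electrical potentials on the graph.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import cg

# Sparse adjacency matrix of a 10x10 grid graph (an arbitrary example).
side = 10
n = side * side
rows, cols = [], []
for r in range(side):
    for c in range(side):
        i = r * side + c
        if c + 1 < side:
            rows += [i, i + 1]; cols += [i + 1, i]
        if r + 1 < side:
            rows += [i, i + side]; cols += [i + side, i]
A = sp.csr_matrix((np.ones(len(rows)), (rows, cols)), shape=(n, n))

L = sp.diags(np.asarray(A.sum(axis=1)).ravel()) - A   # sparse graph Laplacian

b = np.zeros(n)
b[0], b[-1] = 1.0, -1.0          # unit current in at node 0, out at node n-1

# L is singular (constant null space), but CG converges because b sums to zero.
v, info = cg(L, b)
print(info)                       # 0 means converged
print(v[0] - v[-1])               # effective resistance between the two nodes
```

The point of the fast Laplacian solvers is to make this kind of computation scale to graphs far too large for generic methods.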
2 Tooling
Laplacians.jl (Julia):
Laplacians is a package containing graph algorithms, with an emphasis on tasks related to spectral and algebraic graph theory. It contains (and will contain more) code for solving systems of linear equations in graph Laplacians, low stretch spanning trees, sparsification, clustering, local clustering, and optimization on graphs.
All graphs are represented by sparse adjacency matrices. This is both for speed, and because our main concerns are algebraic tasks. It does not handle dynamic graphs. It would be very slow to implement dynamic graphs this way.
Much more tooling under graph neural networks.