Placeholder for classic linear systems theory applied to networks, in the form of spectral graph theory. I split this off from the networks notebook, since that material started to feel mostly like algebraic graph methods, as used in graph neural networks.

I am interested in how filters can be defined on graphs; apparently they can?

Defferrard, Bresson, and Vandergheynst (2016) are credited with introducing *localized* Chebyshev filtering, defined in terms of the graph Laplacian, to neural networks.
David I. Shuman, Vandergheynst, and Frossard (2011) seem to have introduced this in a signal-processing setting (see the overview in D. I. Shuman et al. (2013)).
Isufi et al. (2017) define an analogue of standard linear filters.
There is a helpful visual comparison of both methods in Andreas Loukas's blog post.
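For concreteness, here is a minimal sketch (mine, not Defferrard et al.'s implementation) of the Chebyshev recurrence that localized filtering rests on, assuming numpy, a precomputed Laplacian `L`, and arbitrary filter coefficients `theta`:

```python
import numpy as np

def chebyshev_filter(L, x, theta, lmax=2.0):
    """Apply a degree-K Chebyshev graph filter, sum_k theta[k] T_k(L~) x.

    L     : (n, n) graph Laplacian (dense here, for simplicity)
    x     : (n,) graph signal
    theta : length-K filter coefficients (learned, in a GNN setting)
    lmax  : largest Laplacian eigenvalue; 2.0 is a safe bound for the
            normalized Laplacian
    """
    n = L.shape[0]
    L_tilde = 2.0 * L / lmax - np.eye(n)   # rescale spectrum into [-1, 1]
    T_prev, T_curr = x, L_tilde @ x        # T_0(L~) x and T_1(L~) x
    out = theta[0] * T_prev
    if len(theta) > 1:
        out = out + theta[1] * T_curr
    for k in range(2, len(theta)):
        # Chebyshev recurrence: T_k = 2 L~ T_{k-1} - T_{k-2}
        T_prev, T_curr = T_curr, 2.0 * L_tilde @ T_curr - T_prev
        out = out + theta[k] * T_curr
    return out
```

Because \(T_k(\tilde L)\) is a degree-\(k\) polynomial in the Laplacian, the filter output at a node only depends on values within \(k\) hops; that is the "localized" part, and it avoids computing any eigendecomposition.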

Many overlaps, different phrasing: matrix factorisation.

## Link slurry

Calculate the eigenvalues of your connectivity matrix and learn some stuff.
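"Some stuff" can be as simple as spectral bisection: the eigenvector of the second-smallest Laplacian eigenvalue (the Fiedler vector) tends to change sign across sparse cuts. A toy numpy sketch on a hand-built graph of my own choosing:

```python
import numpy as np

# Two triangles joined by a single edge: a graph with an obvious 2-way cut.
A = np.zeros((6, 6))
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

L = np.diag(A.sum(axis=1)) - A       # combinatorial Laplacian D - A
eigvals, eigvecs = np.linalg.eigh(L)  # eigenvalues in ascending order
fiedler = eigvecs[:, 1]               # eigenvector of second-smallest eigenvalue
partition = fiedler > 0               # sign pattern bisects the graph
```

Here `partition` puts nodes 0–2 on one side and 3–5 on the other (up to an arbitrary overall sign), recovering the obvious cut through the bridging edge.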

Or take the Fourier transform and also learn some stuff.
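A hedged sketch of what "the Fourier transform" means here: the Laplacian eigenvectors serve as Fourier modes, eigenvalues order them from smooth to oscillatory, and filtering is pointwise manipulation of the transformed coefficients. The function name and setup are my own:

```python
import numpy as np

def graph_fourier_lowpass(L, x, keep):
    """Low-pass filter a graph signal x by keeping only its `keep`
    lowest-frequency graph Fourier coefficients.

    The Laplacian eigenvectors play the role of Fourier modes; the
    eigenvalues order them from smooth (small) to oscillatory (large).
    """
    eigvals, U = np.linalg.eigh(L)   # eigenvalues ascending
    x_hat = U.T @ x                  # forward graph Fourier transform
    x_hat[keep:] = 0.0               # discard high-frequency content
    return U @ x_hat                 # inverse transform
```

On a connected graph the eigenvector for eigenvalue 0 is constant, so `keep=1` returns the mean of `x` at every node, the crudest possible smoothing.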

Kirchhoff's theorem gives us, roughly, that the number of spanning trees in a graph equals any cofactor of the graph Laplacian matrix (i.e. the determinant of the matrix obtained by deleting row \(i\) and column \(i\)). Wow.
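This is easy to sanity-check numerically. For the complete graph \(K_4\), Cayley's formula predicts \(4^{4-2} = 16\) spanning trees:

```python
import numpy as np

# Matrix-tree theorem on K4: Cayley's formula gives n^(n-2) = 4^2 = 16
# spanning trees for the complete graph on 4 nodes.
n = 4
A = np.ones((n, n)) - np.eye(n)                        # adjacency of K4
L = np.diag(A.sum(axis=1)) - A                         # Laplacian D - A
minor = np.delete(np.delete(L, 0, axis=0), 0, axis=1)  # drop row & column 0
num_spanning_trees = round(np.linalg.det(minor))
print(num_spanning_trees)  # 16
```

Deleting any other row/column pair \(i, i\) gives the same count, which is the "any cofactor" part of the theorem.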

Spielman's Laplacian Linear Equations, Graph Sparsification, Local Clustering, Low-Stretch Trees, etc. is the best start and links to lots of online textbooks and so on. Is it the same as this other page of the same title?

> Shang-Hua Teng and I wrote a large paper on the problem of solving systems of linear equations in the Laplacian matrices of graphs. This paper required many graph-theoretic algorithms, most of which have been greatly improved. This page is an attempt to keep track of the major developments in and applications of these ideas.
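The naive dense version of "solving systems of linear equations in the Laplacian matrices of graphs" looks like the sketch below (this is *not* the Spielman–Teng near-linear-time algorithm, just the problem statement made concrete): the Laplacian is singular with the all-ones vector in its null space, so we ground one node and solve the reduced nonsingular system.

```python
import numpy as np

def solve_laplacian(L, b):
    """Solve L x = b for a connected graph's Laplacian.

    L is singular (the all-ones vector spans its null space), so a
    solution exists only when b sums to zero. We 'ground' node 0 at
    potential 0 and solve the remaining nonsingular system -- the dense
    analogue of what fast Laplacian solvers do approximately.
    """
    assert abs(b.sum()) < 1e-9, "b must be orthogonal to the all-ones vector"
    x = np.zeros_like(b)
    x[1:] = np.linalg.solve(L[1:, 1:], b[1:])
    return x
```

The electrical reading: `b` injects current into the graph-as-resistor-network and `x` is the resulting voltage at each node, with node 0 held at zero.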

## Tooling

`Laplacians.jl` (Julia):

> Laplacians is a package containing graph algorithms, with an emphasis on tasks related to spectral and algebraic graph theory. It contains (and will contain more) code for solving systems of linear equations in graph Laplacians, low stretch spanning trees, sparsification, clustering, local clustering, and optimization on graphs.

> All graphs are represented by sparse adjacency matrices. This is both for speed, and because our main concerns are algebraic tasks. It does not handle dynamic graphs. It would be very slow to implement dynamic graphs this way.

Much more tooling under graph neural networks.

## References

*Neural Computation* 15 (6): 1373–96.

*Handbook of Linear Algebra*, 13.

*Spectral Graph Theory*. American Mathematical Soc.

*Advances in Neural Information Processing Systems*.

*The Annals of Applied Probability* 1 (1): 36–61.

*Proceedings of the 2005 SIAM International Conference on Data Mining*, 606–10. Proceedings. Society for Industrial and Applied Mathematics.

*Handbook of Linear Algebra*. Second edition. Discrete Mathematics and Its Applications. Boca Raton, Florida: CRC Press/Taylor & Francis Group.

*IEEE Transactions on Signal Processing* 65 (2): 274–88.

*arXiv:1609.02907 [cs, stat]*.

*arXiv Preprint arXiv:1608.04845*.

*Proceedings of the National Academy of Sciences* 103 (23): 8577–82.

*IEEE Signal Processing Magazine* 30 (3): 83–98.

*2011 International Conference on Distributed Computing in Sensor Systems and Workshops (DCOSS)*, June, 1–8.

*Physical Review E* 88 (3): 032807.

*arXiv:0808.4134 [cs]*, August.

*arXiv:0809.3232 [cs]*, September.

*IEEE Transactions on Artificial Intelligence* 2 (2): 109–27.
