# Learning with PDE conservation laws

October 15, 2019 — June 3, 2024


Unlike PINNs, which penalise deviation from conservation laws via a term in the loss function, we can impose the symmetries and conservation laws in the neural-net architecture itself, so that they hold exactly by construction rather than approximately at the optimum.
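As a minimal sketch of the architectural approach, consider a 1-D scalar conservation law on a periodic grid. If the network predicts *fluxes* at cell interfaces and the state update is the discrete divergence of those fluxes, then total mass is conserved exactly, whatever the network weights. Here `flux_net` is a hypothetical stand-in for any learned map; a PINN would instead add a penalty like `(u_next.sum() - u.sum())**2` to the loss and only conserve approximately.

```python
import numpy as np

rng = np.random.default_rng(0)

def flux_net(u, W):
    # Placeholder "network": any differentiable map from the state to a
    # flux at each cell interface. In practice this would be a real NN.
    return np.tanh(W @ u)

def conservative_step(u, W, dx=1.0):
    # Architecture-level conservation: the update is the discrete
    # divergence of predicted fluxes (periodic boundary), so the
    # telescoping sum guarantees sum(u) is preserved for any weights W.
    F = flux_net(u, W)
    div = (np.roll(F, -1) - F) / dx
    return u - div

u = rng.normal(size=16)          # random initial state
W = 0.1 * rng.normal(size=(16, 16))
u_next = conservative_step(u, W)
print(np.allclose(u_next.sum(), u.sum()))  # conserved by construction
```

The point of the design is that conservation is independent of training: no weight update can violate it, which removes one constraint from the loss landscape entirely.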

TBD

## 1 References

Bloem-Reddy and Teh. 2020. “Probabilistic Symmetries and Invariant Neural Networks.”

Di Giovanni, Rowbottom, Chamberlain, et al. 2022. “Graph Neural Networks as Gradient Flows.”

Pestourie, Mroueh, Rackauckas, et al. 2022. “Physics-Enhanced Deep Surrogates for PDEs.”

Rezende, Racanière, Higgins, et al. 2019. “Equivariant Hamiltonian Flows.” In *Machine Learning and the Physical Sciences Workshop at the 33rd Conference on Neural Information Processing Systems (NeurIPS)*.

Ruhe, Gupta, de Keninck, et al. 2023. “Geometric Clifford Algebra Networks.” *arXiv preprint arXiv:2302.06594*.

Smets, Portegies, Bekkers, et al. 2023. “PDE-Based Group Equivariant Convolutional Neural Networks.” *Journal of Mathematical Imaging and Vision*.