Normalising flows for PDE learning.

Figure 1

Lipman et al. (2023) appears to be the origin point of flow matching, extended by Kerrigan, Migliorini, and Smyth (2024) to function-valued data such as PDE solutions.

Figure 2: An illustration of the FFM method. The vector field $v_t^f$ (in black) transforms a noise sample $g \sim \mu_0 = \mathcal{N}(0, C_0)$ drawn from a Gaussian process with a Matérn kernel (at $t=0$) to the function $f(x)=\sin(x)$ (at $t=1$) via solving a function-space ODE. By sampling many such $g \sim \mu_0$, we define a conditional path of measures $\mu_t^f$ approximately interpolating between $\mathcal{N}(0, C_0)$ and the function $f$, which we marginalize over samples $f \sim \nu$ from the data distribution in order to obtain a path of measures approximately interpolating between $\mu_0$ and $\nu$. (Kerrigan, Migliorini, and Smyth 2024)
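The construction in the caption can be sketched numerically. The snippet below is a minimal illustration, not the paper's implementation: it assumes a straight-line (OT-style) conditional path $f_t = (1-t)\,g + t\,f$, a Matérn-3/2 kernel with a made-up lengthscale, and uses the known closed-form conditional velocity $f - g$ in place of a trained neural operator, so Euler integration of the function-space ODE carries the GP draw $g$ at $t=0$ onto $f(x)=\sin(x)$ at $t=1$.

```python
import numpy as np

# Discretize the domain [0, 2*pi] on a grid of 128 points.
x = np.linspace(0.0, 2.0 * np.pi, 128)

def matern32(x1, x2, ell=0.5):
    """Matérn-3/2 covariance (lengthscale ell=0.5 is an assumption)."""
    d = np.abs(x1[:, None] - x2[None, :])
    a = np.sqrt(3.0) * d / ell
    return (1.0 + a) * np.exp(-a)

C0 = matern32(x, x) + 1e-6 * np.eye(len(x))  # jitter for stability
rng = np.random.default_rng(0)
g = rng.multivariate_normal(np.zeros(len(x)), C0)  # noise draw g ~ N(0, C0)

f = np.sin(x)  # the target function at t = 1

def velocity(ft, t):
    # For the linear path f_t = (1-t) g + t f, the conditional velocity
    # d f_t / dt = f - g is constant in t; in FFM a neural operator is
    # trained to regress onto such targets, marginalized over f ~ nu.
    return f - g

# Euler integration of the function-space ODE from t=0 to t=1.
ft = g.copy()
n_steps = 100
dt = 1.0 / n_steps
for k in range(n_steps):
    ft = ft + dt * velocity(ft, k * dt)

# ft now numerically coincides with f, as in the figure.
print(np.max(np.abs(ft - f)))
```

Because the velocity is constant along this particular path, the Euler solve is exact up to floating-point error; with a learned vector field the same loop would approximate the transport from $\mu_0$ to $\nu$.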

1 References

Cheng, Han, Maddix, et al. 2024. “Hard Constraint Guided Flow Matching for Gradient-Free Generation of PDE Solutions.”
Kerrigan, Migliorini, and Smyth. 2024. “Functional Flow Matching.” In Proceedings of The 27th International Conference on Artificial Intelligence and Statistics.
———. 2025. “Dynamic Conditional Optimal Transport Through Simulation-Free Flows.” In Advances in Neural Information Processing Systems.
Lipman, Chen, Ben-Hamu, et al. 2023. “Flow Matching for Generative Modeling.” In International Conference on Learning Representations.
Liu, Gong, and Liu. 2022. “Flow Straight and Fast: Learning to Generate and Transfer Data with Rectified Flow.”
Shi, Gao, Ross, et al. 2024. “Universal Functional Regression with Neural Operator Flows.” Transactions on Machine Learning Research.