A useful differentiable PDE solver.
PhiFlow: A differentiable PDE solving framework for machine learning (Holl et al. 2020):
- Variety of built-in PDE operations with focus on fluid phenomena, allowing for concise formulation of simulations.
- Tight integration with PyTorch, Jax and TensorFlow for straightforward neural network training with fully differentiable simulations that can run on the GPU.
- Flexible, easy-to-use web interface featuring live visualizations and interactive controls that can affect simulations or network training on the fly.
- Object-oriented, vectorized design for expressive code, ease of use, flexibility and extensibility.
- Reusable simulation code, independent of backend and dimensionality, i.e. the exact same code can run a 2D fluid sim using NumPy and a 3D fluid sim on the GPU using TensorFlow or PyTorch.
- High-level linear equation solver with automated sparse matrix generation.
Phiflow seems to have fewer elaborate PDEs built in than Mantaflow, but it appears to have deeper and more flexible ML integration and more active development, as seen in various papers from the TUM group (Holl, Thuerey, and Koltun 2020; Um and Holl 2021; Um et al. 2021).
It is a lovely package in its way; that way is quirky, hipster and artisanal. The documentation is dispersed and confusing, scattered across video tutorials, idiosyncratic and rapidly outdated API docs, tutorials, demos and manuals. It reinvents a few wheels while trying to be helpful, there are occasional impedance mismatches between this PDE-first framework and the needs of ML, and it makes a lot of opinionated design choices. Pet peeve: providing a unified API over various toolkits, which makes 80% of PDE tasks easy and the remaining 20% utterly baffling. I am currently trying to discover how easy it is to stitch together PDEs and NNs manually and propagate gradients between them.
Most of the notes on this page concern the Phiflow v2 API, and were tested against Phiflow 2.1.4.
Documentation central
The YouTube tutorials are helpful. The “main” docs site is guide-like. API reference docs are stored separately: phi API documentation.
Arrays in phiflow
A.k.a. Tensors, which are wrappers around the tensor objects in whatever math backend phiflow is using.
Arrays have mandatory names that are used in broadcasting rules. This is best explained in the video tutorial, Working with Tensors. Broadcasting between objects with “spatial”, “batch”, “instance” and “channel” dimensions is largely automatic. I guess “temporal” is implied somehow? Maybe via Scene objects? I do not use those.
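A minimal sketch of named-dimension broadcasting, assuming the phi 2.x math API (the dimension names here are my own):

from phi import math
from phi.math import spatial, batch

a = math.random_normal(spatial(x=8, y=8))   # spatial dims only
b = math.random_normal(batch(examples=4))   # batch dim only
c = a + b                                   # broadcasts by dimension name
print(c.shape)                              # batch and spatial dims combined: (examples=4, x=8, y=8)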
Fields in phiflow
A sampling grid plus some sample data at grid locations gives us a SampledField. We could imagine fields that are defined in terms of some general mathematical function. Indeed the documentation references an AnalyticField, but this appears not to be implemented, so we can consider everything to be a SampledField for now.
CenteredGrid objects are reasonably obvious.
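For instance, a hedged sketch assuming the phi 2.x constructor (resolution and extrapolation values are arbitrary):

from phi.flow import *

# a 64×64 scalar field, zero everywhere, zero-padded at the boundary
grid = CenteredGrid(0, extrapolation.ZERO, x=64, y=64)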
Sampling
Fields can be sampled in a few different ways.
I am trying to learn which ones are differentiable.
Field.at is probably usually what I want?
inf_field = field.at(
    CenteredGrid(0, x=16, y=16, bounds=field.bounds),
    keep_extrapolation=True)
But there are options. This also works:
inf_field = CenteredGrid(
    field, x=16, y=16, bounds=field.bounds,
    extrapolation=field.extrapolation)
Sampling overview explains some more.
Various sample methods take us from a Field to a Tensor.
math.downsample2x and math.sample_subgrid address special grid relations.
field.sample and field.reduce_sample seem to accept arbitrary geometries, which is useful, although I am still confused about how to specify useful Geometries. A sketch follows.
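Here is a hedged sketch of going from a Field to a Tensor by sampling at the cell geometry of a coarser grid, assuming the phi 2.x API (the grids here are my own toy example):

from phi.flow import *

fine = CenteredGrid(Noise(), extrapolation.ZERO, x=32, y=32)
coarse = CenteredGrid(0, extrapolation.ZERO, x=8, y=8, bounds=fine.bounds)

# sample at the coarse grid's cell geometry: returns a phi Tensor, not a Field
values = field.sample(fine, coarse.elements)
print(values.shape)  # spatial dims (x=8, y=8)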
Actual physics
I have few notes about this; the actual physics part is what phiflow makes easy.
- phi.physics API documentation:
- Fluids Tutorial: Introduction to core classes and fluid-related functions.
- Overview: Domains, built-in physics functions.
- Functions for Fluid Simulations: Advection, projection, diffusion
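For flavour, a hedged sketch of a basic smoke step, closely following the published phiflow demos (assuming the phi 2.x API; the resolution and buoyancy factor are arbitrary):

from phi.flow import *

velocity = StaggeredGrid(0, extrapolation.ZERO, x=64, y=64)
smoke = CenteredGrid(0, extrapolation.BOUNDARY, x=64, y=64)

def step(velocity, smoke, dt=1.0):
    smoke = advect.mac_cormack(smoke, velocity, dt)
    buoyancy = (smoke * (0, 0.1)).at(velocity)   # resample onto the staggered grid
    velocity = advect.semi_lagrangian(velocity, velocity, dt) + dt * buoyancy
    velocity, pressure = fluid.make_incompressible(velocity)
    return velocity, smoke

velocity, smoke = step(velocity, smoke)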
Optimisation
There are two types of optimisation supported in PhiFlow, with two different APIs.
One is the PhiFlow native optimisation, which optimises Fields and PhiFlow Tensor objects. This copies the scipy.optimize.minimize interface.
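A minimal sketch of the native optimiser, assuming the phi 2.x API and a differentiable backend (the loss here is a toy of my own):

from phi.torch.flow import *   # the NumPy backend cannot supply gradients

def loss(x):
    return math.l2_loss(x - 1)   # optimum at x == 1 everywhere

x0 = math.zeros(spatial(x=8))
best = math.minimize(loss, Solve('L-BFGS-B', 0, 1e-5, x0=x0))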
Another is the NN-style SGD training, which optimises NN parameters. This looks like a normal SGD training loop, as per <insert favourite NN framework here>.
As usual with the scipy.minimize-style system, there is not much scope to see what is happening during the optimisation. There is an example showing how to do that better in the Physics-based Deep Learning textbook Burgers Optimization with a Differentiable Physics Gradient, although it uses an outdated record_gradients API.
A shorter but more modern example is in the cookbook.
ML
The API is idiosyncratic. Best explained through examples.
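For instance, here is a hedged sketch of stitching a torch network to phiflow maths by hand, so that gradients flow from phi operations back into the network weights (assuming the phi 2.x torch backend; the network and loss are toys of my own):

import torch
from phi.torch.flow import *

net = torch.nn.Linear(8, 8)                     # toy network
optim = torch.optim.Adam(net.parameters(), lr=1e-3)
x0 = math.random_normal(spatial(x=8))           # torch-backed phi Tensor

for _ in range(200):
    optim.zero_grad()
    y = net(x0.native('x'))                     # phi Tensor -> raw torch tensor
    pred = math.wrap(y, spatial(x=8))           # back into phi-land
    loss = math.l2_loss(pred - 1)               # phi ops in the loss
    loss.native().backward()                    # scalar phi Tensor -> torch scalar
    optim.step()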
Visualization
Messy. They created a lovely UI for controlling simulations interactively, but it is unsatisfactory: python GUIs suck, jupyter GUIs suck, and trying to serve two sucky masters is tedious.
As per the advice of lead developer Philipp Holl, I ignore the entire Vis system, which only works from command-line scripts.
I plot inside jupyter notebooks for now.
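That is, something like the following (a sketch, assuming the phi 2.x vis module; the grid is a toy of my own):

from phi.flow import *

grid = CenteredGrid(Noise(), extrapolation.ZERO, x=64, y=64)
vis.plot(grid)   # renders a matplotlib figure inline in jupyter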
If I wanted something more sophisticated I might use the ΦFlow Web Interface which integrates with dash.
I am not sure why they do not just use one of the fairly standard tools for ML experiment tracking and visualisation, such as tensorboard or whatever; the developers of those tools have already experienced the many irritations of trying to do this stuff interactively and found workarounds.
Efficiency
Useful examples
Data storage
I do not use the native phiflow system, since I store everything in HDF5, but it is nice that it is documented.