On instructing cells to grow into differentiated bodies. This notebook has been resurrected from the trash bin years after I deleted it, because I enjoyed Mordvintsev et al. (2020) so much.
Mordvintsev et al. (2020) is a fun paper. They improve upon boring old-school cellular automata in several ways (not all of them completely novel on their own, but the combination seems to be):
- Continuous states whose rules can be differentiably learned
- Use of Sobel filters for CA based on local gradients
- Framing the problem as “designing attractors of a dynamical system”
- Clever use of noise in the training.
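To make the first two items concrete, here is a minimal sketch of one neural-CA update step: perceive the neighbourhood with Sobel filters, map the perception vector through a tiny per-cell MLP, and apply the resulting residual update stochastically. This is my own toy reconstruction, not the paper's implementation — it omits the alive-masking and all training machinery, and every function and parameter name here is mine.

```python
import numpy as np

# Sobel kernels estimate local state gradients (the "perception" step).
SOBEL_X = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=np.float32)
SOBEL_Y = SOBEL_X.T

def conv2d_same(state, kernel):
    """3x3 convolution with zero padding, applied independently per channel."""
    h, w, _ = state.shape
    padded = np.pad(state, ((1, 1), (1, 1), (0, 0)))
    out = np.zeros_like(state)
    for dy in range(3):
        for dx in range(3):
            out += kernel[dy, dx] * padded[dy:dy + h, dx:dx + w, :]
    return out

def perceive(state):
    """Concatenate identity, x-gradient and y-gradient channels: (H, W, 3C)."""
    return np.concatenate(
        [state, conv2d_same(state, SOBEL_X), conv2d_same(state, SOBEL_Y)],
        axis=-1)

def ca_step(state, w1, b1, w2, fire_rate=0.5, rng=None):
    """One CA update: perceive -> tiny per-cell MLP -> stochastic residual."""
    rng = rng or np.random.default_rng(0)
    z = perceive(state)                    # (H, W, 3C)
    hidden = np.maximum(z @ w1 + b1, 0.0)  # ReLU layer, shared across cells
    delta = hidden @ w2                    # (H, W, C) proposed update
    # Asynchronous update: each cell fires independently with prob fire_rate.
    mask = rng.random(state.shape[:2])[..., None] < fire_rate
    return state + delta * mask
```

Because every operation is a convolution or matrix product, the whole step is differentiable in `w1`, `b1`, `w2`, which is what lets the update rule be learned by gradient descent in the first place.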
*Image from Growing Neural Cellular Automata.*
I am instinctively annoyed by the unfashionable loss function, which is not any kind of optimal transport metric, but it works, so don’t listen to me about that. Logical extensions leap out: a model that produces different patterns parametrically and interpolates between them, for instance. I am also curious whether an information-bottleneck analysis could tell us which patterns can be learned at all.
More general models of morphogenesis are out there, obviously. For now I will not touch upon how this happens in real creatures, as opposed to fake emoji monsters.
TBD: connections to biological computing, and to the particular special case of models of pattern formation.