A family of algorithms inspired by nature, which work by trying lots of candidate solutions, keeping what works and discarding what doesn’t, much as evolution itself acts on us, sort of. Basic, sturdy genetic algorithms, John Holland style. Genetic programming, Koza-style. Evolution strategies, à la Bienert, Rechenberg and Schwefel. Simple; accessible to amateurs. Mind you, the same goes for neural networks these days, and they are better in most circumstances. Genetic algorithms can be made robust against noisy fitness functions. They can solve some tricky problems. But, compared to more specialised algorithms, they are outrageously slow and profligate.
So I don’t use them practically, although they are interesting to think about as theoretical models for systems in the real world, such as evolution itself.
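To make the "try lots of tactics, keep what works" loop concrete, here is a minimal sketch of a Holland-style genetic algorithm: bitstring genomes, tournament selection, one-point crossover, per-bit mutation. The "one-max" fitness function (count the 1s) is a standard toy problem, not anything from the literature above; all the parameter values are illustrative guesses, not tuned.

```python
import random

def one_max(genome):
    # Toy fitness: how many bits are set. Optimum is all-ones.
    return sum(genome)

def tournament(population, fitness, k=3):
    # Select the fittest of k randomly chosen individuals.
    return max(random.sample(population, k), key=fitness)

def crossover(a, b):
    # One-point crossover: splice a prefix of one parent onto a suffix of the other.
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def mutate(genome, rate=0.01):
    # Flip each bit independently with small probability.
    return [bit ^ 1 if random.random() < rate else bit for bit in genome]

def evolve(fitness, genome_length=50, pop_size=100, generations=100):
    population = [[random.randint(0, 1) for _ in range(genome_length)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Each child: select two parents by tournament, cross them, mutate.
        population = [
            mutate(crossover(tournament(population, fitness),
                             tournament(population, fitness)))
            for _ in range(pop_size)
        ]
    return max(population, key=fitness)

random.seed(0)
best = evolve(one_max)
print(one_max(best))
```

Note how profligate this is even on a trivial problem: tens of thousands of fitness evaluations to climb a hill that gradient-free local search would finish in a few hundred.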
Things to understand
- Evolving “robustness”, and “modularity”, whatever those are.
- Generalising from an evolutionary to a market metaphor for fitness, or finding some nice combination of those two metaphors.
- When is an evolutionary algorithm the best you can do? Why?