NEAT
In Chapter 9, Architecture Optimization of Deep Learning Networks, we demonstrated how a simple genetic algorithm can be used to find the best architecture of a feed-forward neural network (also known as a multilayer perceptron, or MLP) for a particular task. To do that, we limited the search to at most three hidden layers and encoded each network as a fixed-size chromosome with a placeholder for each layer, where a value of 0 or less meant that the layer did not exist.
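To make this encoding concrete, here is a minimal sketch of how such a fixed-size chromosome might be decoded into hidden-layer sizes; the function name and the rule of truncating at the first non-positive gene are illustrative assumptions, not the exact code from Chapter 9:

```python
def decode_chromosome(chromosome):
    """Convert a fixed-size chromosome into a tuple of hidden-layer sizes.

    Each gene holds the node count for one hidden layer; a value of 0 or
    less marks the layer as absent (here, it also ends the network).
    """
    layer_sizes = []
    for gene in chromosome:
        size = int(round(gene))
        if size <= 0:              # 0 or negative -> layer does not exist
            break
        layer_sizes.append(size)
    return tuple(layer_sizes)

# Three placeholder genes; the negative value marks a missing third layer
print(decode_chromosome([20, 10, -5]))   # -> (20, 10)
```

The key limitation of this scheme is visible here: the chromosome length, and therefore the maximum depth of the network, is fixed before evolution begins.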
Taking this idea further, NEAT (NeuroEvolution of Augmenting Topologies), created in 2002 by Kenneth Stanley and Risto Miikkulainen, is an evolutionary technique for building neural networks in a more flexible and incremental way.
NEAT starts with small, simple neural networks and allows them to evolve by adding and modifying neurons and connections over generations. Rather than using a fixed-size chromosome, NEAT represents solutions as directed graphs that map directly onto artificial neural networks, where nodes represent neurons and edges represent the weighted connections between them.
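The following sketch illustrates this graph-based representation using simple Python dataclasses; the class and field names are assumptions made for illustration, though the split into node genes and connection genes, with enabled flags and innovation numbers, follows the original NEAT paper:

```python
from dataclasses import dataclass

@dataclass
class NodeGene:
    node_id: int
    node_type: str        # 'input', 'hidden', or 'output'

@dataclass
class ConnectionGene:
    in_node: int          # id of the source neuron
    out_node: int         # id of the target neuron
    weight: float
    enabled: bool         # disabled genes stay in the genome but carry no signal
    innovation: int       # historical marker NEAT uses to align genes in crossover

# The smallest possible starting network: one input wired to one output.
# Evolution can then grow this graph by adding nodes and connections.
genome = {
    "nodes": [NodeGene(0, "input"), NodeGene(1, "output")],
    "connections": [ConnectionGene(0, 1, weight=0.5, enabled=True, innovation=0)],
}
```

Because the genome is a variable-size list of genes rather than a fixed-length vector, the network's topology itself, not just its weights, can grow over generations.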