In all previous recipes, we only considered static network architectures: the network did not change while we trained our networks or agents. We have also seen that the network architecture and the hyperparameters can have a big effect on the results. However, we often don't know in advance whether a network will perform well, so we need to test it thoroughly. There are different ways to optimize these hyperparameters. In Chapter 12, Hyperparameter Selection, Tuning, and Neural Network Learning, we demonstrate how to apply a (brute-force) grid search to find optimal hyperparameters. However, sometimes the hyperparameter space is enormous, and brute force would simply take too much time.
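To see why brute force scales poorly, consider a quick sketch of how many training runs a grid search requires. The hyperparameter names and candidate values below are purely illustrative, not taken from any specific recipe:

```python
from itertools import product

# Hypothetical hyperparameter grid; names and values are illustrative only.
grid = {
    "learning_rate": [1e-1, 1e-2, 1e-3, 1e-4],
    "hidden_units": [32, 64, 128, 256],
    "batch_size": [16, 32, 64],
    "dropout": [0.0, 0.25, 0.5],
}

# A brute-force grid search trains and evaluates one model
# per combination of hyperparameter values.
combinations = list(product(*grid.values()))
print(len(combinations))  # 4 * 4 * 3 * 3 = 144 full training runs
```

Every extra hyperparameter multiplies the number of runs, so the search quickly becomes infeasible when a single training run takes hours.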
Evolutionary Algorithms (EAs) have proven to be a powerful alternative. One of the most impressive outcomes is...