Summary
In this chapter, you explored gradient-based and gradient-free methods of optimization, with an emphasis on the potential of evolutionary algorithms – in particular, GAs – to solve optimization problems using a nature-inspired approach, including escaping the sub-optimal solutions in which gradient-based methods can become trapped. GAs consist of specific stages, namely population generation, parent selection, parent reproduction (crossover), and finally mutation, which they iterate over to evolve an optimal solution, typically encoded as a binary string.
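To make these stages concrete, here is a minimal, self-contained sketch of the GA loop in Python. The OneMax objective (maximizing the count of 1-bits), the tournament selection scheme, and all population sizes and rates are illustrative assumptions rather than the chapter's exact implementation:

```python
# A minimal GA sketch: population generation, parent selection,
# crossover, and mutation over binary genomes. The OneMax fitness
# and all settings below are illustrative assumptions.
import random

POP_SIZE, GENOME_LEN, GENERATIONS = 20, 16, 50
CROSSOVER_RATE, MUTATION_RATE = 0.9, 0.05

def fitness(genome):
    # Toy objective: maximize the number of 1-bits in the genome.
    return sum(genome)

def tournament(population, k=3):
    # Parent selection: pick the fittest of k random individuals.
    return max(random.sample(population, k), key=fitness)

def crossover(a, b):
    # Single-point crossover producing one child genome.
    point = random.randint(1, GENOME_LEN - 1)
    return a[:point] + b[point:]

def mutate(genome):
    # Flip each bit with a small probability.
    return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

# Population generation: random binary genomes.
population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    next_population = []
    for _ in range(POP_SIZE):
        parent_a, parent_b = tournament(population), tournament(population)
        child = (crossover(parent_a, parent_b)
                 if random.random() < CROSSOVER_RATE else parent_a[:])
        next_population.append(mutate(child))
    population = next_population

best = max(population, key=fitness)
print("Best genome:", best, "fitness:", fitness(best))
```

Each pass through the loop replaces the population with offspring of relatively fit parents, so the average fitness tends to rise generation over generation.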
Then, the use of GAs for hyperparameter tuning and selection for neural networks was explored, helping us find the most suitable window size and number of units. We saw implementations of state-of-the-art algorithms that combine deep neural networks and evolutionary strategies, such as NEAT for XNOR output estimation. Finally, you had a chance to implement what was covered in this chapter through an OpenAI Gym cart-pole simulation, where we examined the application of GAs for parameter selection.
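As a closing illustration, the sketch below applies the same GA machinery to the cart-pole task, evolving the four weights of a linear threshold policy. The environment name, the truncation selection scheme, and all GA settings are assumptions for illustration and may differ from the chapter's own implementation; the reset/step handling covers both the older and newer Gym return signatures:

```python
# Illustrative GA-based parameter selection for cart-pole:
# evolve the weights of a simple linear policy. Settings are
# assumptions, not the chapter's exact code.
import random
import gym  # assumes OpenAI Gym (or a compatible successor) is installed

POP_SIZE, GENERATIONS, MUTATION_STD = 20, 10, 0.5

def run_episode(env, weights):
    # Fitness: total reward earned by a linear threshold policy.
    obs = env.reset()
    if isinstance(obs, tuple):  # newer Gym returns (obs, info)
        obs = obs[0]
    total, done = 0.0, False
    while not done:
        action = 1 if sum(w * o for w, o in zip(weights, obs)) > 0 else 0
        result = env.step(action)
        obs, reward = result[0], result[1]
        # Older Gym returns 4 values; newer returns 5 (terminated, truncated).
        done = result[2] if len(result) == 4 else (result[2] or result[3])
        total += reward
    return total

def crossover(a, b):
    # Uniform crossover over the four policy weights.
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(w):
    # Perturb each weight with Gaussian noise.
    return [x + random.gauss(0, MUTATION_STD) for x in w]

env = gym.make("CartPole-v1")
population = [[random.uniform(-1, 1) for _ in range(4)]
              for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    scored = sorted(population, key=lambda w: run_episode(env, w), reverse=True)
    parents = scored[:POP_SIZE // 2]  # truncation selection: keep fittest half
    population = parents + [mutate(crossover(random.choice(parents),
                                             random.choice(parents)))
                            for _ in range(POP_SIZE - len(parents))]

best = max(population, key=lambda w: run_episode(env, w))
print("Best episode reward:", run_episode(env, best))
```

Note that no gradients are computed anywhere: the policy weights improve purely through selection pressure on episode rewards, which is exactly what makes GAs attractive for problems where gradients are unavailable or unreliable.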