Summary
In this chapter, you were introduced to continuous search-space optimization problems and saw how they can be represented and solved with genetic algorithms using the DEAP framework. We then worked through several hands-on examples of continuous function optimization problems—the Eggholder function, Himmelblau's function, and Simionescu's function—along with their Python-based solutions. In addition, we covered approaches for finding multiple solutions and for handling constraints.
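To recap the core idea in miniature, the following is a hedged, pure-Python sketch (deliberately not using DEAP, so it stays self-contained) of a simple evolutionary loop minimizing Himmelblau's function. The population size, mutation strength, and generation count are illustrative choices, not values from the chapter:

```python
import random

def himmelblau(x, y):
    # Himmelblau's function has four global minima where f = 0,
    # for example at (3.0, 2.0)
    return (x**2 + y - 11)**2 + (x + y**2 - 7)**2

def evolve(generations=200, pop_size=50, sigma=0.3, seed=42):
    rng = random.Random(seed)
    # initial population: random points in the usual [-5, 5] x [-5, 5] domain
    pop = [(rng.uniform(-5, 5), rng.uniform(-5, 5)) for _ in range(pop_size)]
    for _ in range(generations):
        # rank by fitness (lower is better) and keep the best half (elitism)
        pop.sort(key=lambda p: himmelblau(*p))
        survivors = pop[:pop_size // 2]
        # refill the population with Gaussian-mutated copies of the survivors
        children = [(x + rng.gauss(0, sigma), y + rng.gauss(0, sigma))
                    for x, y in survivors]
        pop = survivors + children
    best = min(pop, key=lambda p: himmelblau(*p))
    return best, himmelblau(*best)

best_point, best_fitness = evolve()
print(best_point, best_fitness)
```

This toy loop uses only selection and mutation; DEAP adds crossover operators, statistics, hall-of-fame tracking, and the constraint-handling mechanisms discussed in the chapter.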
In the next four chapters, we will demonstrate how the various techniques we have learned so far can be applied to machine learning (ML)- and artificial intelligence (AI)-related problems. The first of these chapters provides a quick overview of supervised learning (SL) and then demonstrates how genetic algorithms can improve the outcome of learning models by selecting the most relevant portions of the given dataset...