In this chapter, we learned how to deal with class imbalance, a recurrent problem in machine learning where most of the value lies in the minority class. Rare, high-impact events of this kind are common enough that the black swan metaphor is often used to describe them. When machine learning algorithms blindly optimize their out-of-the-box objective functions, they usually miss those black swans. Hence, we use techniques such as sample weighting, sample removal, and sample generation to steer the algorithms toward our own objectives.
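As a minimal sketch of the sample-weighting idea, assuming scikit-learn is available: the `class_weight="balanced"` option re-weights each class inversely to its frequency, so the minority class contributes as much to the loss as the majority class. The toy dataset below is invented for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Imbalanced toy data: 950 majority samples vs. 50 minority samples,
# with overlapping Gaussian clusters so the classes are not trivially separable.
X_major = rng.normal(loc=0.0, scale=1.0, size=(950, 2))
X_minor = rng.normal(loc=1.5, scale=1.0, size=(50, 2))
X = np.vstack([X_major, X_minor])
y = np.array([0] * 950 + [1] * 50)

plain = LogisticRegression().fit(X, y)
weighted = LogisticRegression(class_weight="balanced").fit(X, y)

# The weighted model typically recalls more of the minority class,
# usually at the cost of more false positives on the majority class.
minority_mask = y == 1
recall_plain = (plain.predict(X[minority_mask]) == 1).mean()
recall_weighted = (weighted.predict(X[minority_mask]) == 1).mean()
print(f"minority recall, unweighted: {recall_plain:.2f}")
print(f"minority recall, balanced:   {recall_weighted:.2f}")
```

The same effect can be achieved per sample rather than per class by passing a `sample_weight` array to `fit`, which is useful when individual examples (not whole classes) carry different costs.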
This was the last chapter in this book about supervised learning algorithms. A rough estimate holds that 80% of the machine learning problems in business settings and academia are supervised learning problems, which is why about 80% of this book focused on that paradigm. From the next chapter onward, we will cover the other machine learning paradigms, where roughly the remaining 20% of real-life value resides. We will start by...