Chapter 11: Catastrophic Forgetting
In the previous two chapters, we looked at a number of auxiliary tasks that arise in online machine learning and working with streaming data. Chapter 9 covered drift detection and solutions, and Chapter 10 covered feature transformation and scaling in a streaming context. This chapter introduces a third and final topic on this list of auxiliary tasks, namely catastrophic forgetting.
Catastrophic forgetting, also known as catastrophic interference, is the tendency of machine learning models to forget what they have learned when they are updated: correctly learned older patterns are wrongly overwritten as new patterns are learned from new data.
Having seen many examples of online models throughout this book, you will appreciate that continuously updating a model carries a substantial risk of this learning going wrong. It was already touched upon briefly, in the chapter on drift and drift detection, that model learning going wrong...