8.2 Detecting out-of-distribution data
Typical neural networks do not handle out-of-distribution data well. We saw in Chapter 3, Fundamentals of Deep Learning, that a cat-dog classifier classified an image of a parachute as a dog with more than 99% confidence. In this section, we will look at what we can do about this vulnerability of neural networks. We will do the following:
- Explore the problem visually by perturbing a digit of the MNIST dataset
- Explain how out-of-distribution detection performance is typically reported in the literature
- Review the out-of-distribution detection performance of some of the standard practical BDL methods we look at in this chapter
- Explore further practical methods that are specifically tailored to out-of-distribution detection
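Regarding the second point: out-of-distribution detection performance is commonly summarized with a threshold-free metric such as AUROC, which measures how well a scalar score (for example, the maximum softmax probability) separates in-distribution from out-of-distribution inputs. The sketch below illustrates the idea with synthetic scores; the Gaussian parameters and variable names are illustrative assumptions, not results from this chapter.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical confidence scores, where higher means "more in-distribution".
# In a real evaluation these would come from a model scored on, say,
# MNIST test digits (in-distribution) and FashionMNIST images (OOD).
in_dist_scores = rng.normal(loc=0.9, scale=0.05, size=100)
out_dist_scores = rng.normal(loc=0.6, scale=0.15, size=100)

# AUROC equals the probability that a randomly chosen in-distribution
# input receives a higher score than a randomly chosen OOD input,
# so with continuous scores we can compute it as a pairwise comparison.
auroc = np.mean(in_dist_scores[:, None] > out_dist_scores[None, :])
print(f"AUROC: {auroc:.3f}")  # 1.0 = perfect separation, 0.5 = chance
```

With continuous (tie-free) scores, this pairwise formulation matches what a library routine such as scikit-learn's `roc_auc_score` would report.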
8.2.1 Exploring the problem of out-of-distribution detection
To give you a better understanding of how neural networks behave on out-of-distribution data, we will start with a visual example. Here is what we will do...
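The general shape of such a perturbation experiment can be sketched as follows: take an input, perturb it progressively, and track the classifier's maximum softmax confidence. The classifier below is a stand-in (a linear map with random weights) and rotation is just one illustrative perturbation; in the actual experiment you would use a network trained on MNIST and a real test digit.

```python
import numpy as np

rng = np.random.default_rng(1)

def softmax(z):
    # Numerically stable softmax over a vector of logits.
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

# Stand-in for a trained classifier: a linear map from 28x28 pixels to
# 10 class logits. The weights are random purely for illustration; in
# practice this would be a network trained on MNIST.
W = rng.normal(size=(10, 28 * 28))

# Stand-in for a digit: random pixels. In practice, load one image
# from the MNIST test set.
digit = rng.uniform(size=(28, 28))

# Perturb the input by rotating it in 90-degree steps and record the
# classifier's maximum softmax confidence at each rotation.
for k in range(4):
    rotated = np.rot90(digit, k)
    probs = softmax(W @ rotated.ravel())
    print(f"rotation {90 * k:3d} deg: max confidence = {probs.max():.2f}")
```

The point of the real experiment is that the reported confidence need not drop as the input drifts away from the training distribution, which is exactly the vulnerability this section examines.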