Confusion matrix
A confusion matrix is a figure or a table that is used to describe the performance of a classifier. It is usually computed from a test dataset for which the ground truth is known. We compare each predicted class against each actual class to see how many samples were classified correctly and how many were misclassified. In constructing this table, we come across several key metrics that are very important in the field of machine learning. Let's consider a binary classification case where the output is either 0 or 1:
True positives: These are the samples for which we predicted 1 as the output and the ground truth is 1 too.
True negatives: These are the samples for which we predicted 0 as the output and the ground truth is 0 too.
False positives: These are the samples for which we predicted 1 as the output but the ground truth is 0. This is also known as a Type I error.
False negatives: These are the samples for which we predicted 0 as the output but the ground truth is 1. This is also known as a Type II error.
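As a minimal sketch of how these four counts are obtained in practice, the snippet below uses scikit-learn's confusion_matrix on a small set of hypothetical ground-truth and predicted labels (the label lists here are made up purely for illustration). In the binary case, the matrix's rows correspond to the true class and its columns to the predicted class, so the four entries are exactly the quantities defined above.

from sklearn.metrics import confusion_matrix

# Hypothetical ground-truth and predicted labels for a binary classifier
y_true = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0, 1, 0]

# Rows correspond to the true class, columns to the predicted class
cm = confusion_matrix(y_true, y_pred)
print(cm)

# For the binary case, ravel() unpacks the four counts in this fixed order
tn, fp, fn, tp = cm.ravel()
print("True negatives :", tn)
print("False positives:", fp)   # Type I errors
print("False negatives:", fn)   # Type II errors
print("True positives :", tp)

Running this prints the 2x2 matrix followed by the individual counts, which can then be used to derive metrics such as accuracy, precision, and recall.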