Performance Metrics
For classification algorithms, we evaluate performance with a confusion matrix. It is a square matrix that counts the number of True Positive (TP), True Negative (TN), False Positive (FP), and False Negative (FN) outcomes:
Figure 8.45: Confusion matrix
For the sake of simplicity, let's use 1 as the positive class and 0 as the negative class. Then:
TP: The number of cases that were observed and predicted as 1.
FN: The number of cases that were observed as 1 but predicted as 0.
FP: The number of cases that were observed as 0 but predicted as 1.
TN: The number of cases that were observed as 0 and predicted as 0.
Consider the same case study of predicting whether a product will be returned. In that case, the four outcomes above can be understood using the following table:
Figure 8.46: Understanding the metrics
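The four counts can be tallied directly from the observed and predicted labels. The following is a minimal sketch in plain Python; the label vectors are made-up examples (with 1 meaning the product was returned), not data from the case study:

```python
# Hypothetical labels: 1 = product returned (positive), 0 = not returned (negative)
y_true = [1, 0, 1, 1, 0, 0, 1, 0]  # observed outcomes
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]  # classifier's predictions

# Count each outcome by comparing observed and predicted labels pairwise
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # observed 1, predicted 1
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # observed 1, predicted 0
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # observed 0, predicted 1
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)  # observed 0, predicted 0

print(tp, fn, fp, tn)  # for these labels: 3 1 1 3
```

Arranged as a 2×2 matrix with the positive class first, this gives the same layout as the confusion matrix shown earlier: `[[tp, fn], [fp, tn]]`.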
Precision
Precision is the ability of a classifier to not label a sample that is...