We use a confusion matrix to evaluate how a classification model performs on test data for which the true labels are known.
To view the confusion matrix of the model, execute the following code:
import numpy as np
from sklearn.metrics import confusion_matrix

# Convert the one-hot encoded predictions and targets to class labels
y_pred = model.predict(x_test)
y_pred = np.argmax(y_pred, axis=1)
y_test = np.argmax(y_test, axis=1)

# Compute the confusion matrix (named cm so the imported function is not shadowed)
cm = confusion_matrix(y_test, y_pred)
cm
The confusion matrix of the model looks as follows:
Fig 6.25: The confusion matrix
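Because row i of the matrix contains the test samples whose true label is i, the diagonal divided by the row sums gives the per-class recall. The following is a minimal sketch of that calculation using the cm array computed above; scikit-learn's classification_report reports the same figures alongside precision and F1-score:

# Per-class recall: correct predictions on the diagonal divided by row totals
per_class_recall = cm.diagonal() / cm.sum(axis=1)
print(per_class_recall)

# The same metrics (plus precision and F1-score) via scikit-learn
from sklearn.metrics import classification_report
print(classification_report(y_test, y_pred))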
You can also plot the confusion matrix in a more readable, graphical form with the following code:
# Confusion matrix
import itertools
import numpy as np
import matplotlib.pyplot as plt
from sklearn.metrics import confusion_matrix

def plot_confusion_matrix(cm, classes,
                          normalize=False,
                          title='Confusion matrix',
                          cmap=plt.cm.Blues):
    # Draw the matrix as a color-coded heatmap
    plt.imshow(cm, interpolation='nearest', cmap=cmap)
    plt.title(title)
    plt.colorbar()
    tick_marks = np.arange(len(classes))
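The rest of the helper annotates every cell and labels the axes. The lines below are a minimal sketch of that completion and of an example call, following the widely used matplotlib recipe rather than the book's exact listing; the normalization step, the cell-annotation loop, and the call with np.unique(y_test) are assumptions:

    # Label the ticks with the class names
    plt.xticks(tick_marks, classes, rotation=45)
    plt.yticks(tick_marks, classes)

    # Optionally convert counts to per-class rates for the text annotations
    if normalize:
        cm = cm.astype('float') / cm.sum(axis=1)[:, np.newaxis]

    # Write each cell's value onto the heatmap in a readable color
    thresh = cm.max() / 2.
    for i, j in itertools.product(range(cm.shape[0]), range(cm.shape[1])):
        plt.text(j, i, format(cm[i, j], '.2f' if normalize else 'd'),
                 horizontalalignment="center",
                 color="white" if cm[i, j] > thresh else "black")

    plt.tight_layout()
    plt.ylabel('True label')
    plt.xlabel('Predicted label')

# Example call: plot the matrix computed earlier with its class labels
plot_confusion_matrix(cm, classes=np.unique(y_test))
plt.show()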