Keras makes training extremely simple:
model.compile(optimizer='sgd',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.fit(x_train, y_train, epochs=5, verbose=1,
          validation_data=(x_test, y_test))
Calling .compile() on the model we just created is a mandatory step. A few arguments must be specified (each can be passed as a string shortcut or as a Keras object, as sketched after this list):
- optimizer: This is the component that will perform the gradient descent.
- loss: This is the loss function the optimizer will minimize. In our case, we choose cross-entropy, just like in the previous chapter.
- metrics: These are additional metric functions evaluated during training to provide further visibility into the model's performance (unlike the loss, they are not used in the optimization process).
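For finer control, the same configuration can be expressed with explicit Keras objects rather than string shortcuts. The following is a minimal sketch, assuming tf.keras; the learning rate value is illustrative and not taken from this chapter:
import tensorflow as tf

# Same configuration as above, with explicit objects instead of string aliases.
# The learning rate below is an illustrative value, not the Keras default.
optimizer = tf.keras.optimizers.SGD(learning_rate=0.01)
loss = tf.keras.losses.SparseCategoricalCrossentropy()
model.compile(optimizer=optimizer,
              loss=loss,
              metrics=['accuracy'])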
The Keras loss named sparse_categorical_crossentropy performs the same cross-entropy operation as categorical_crossentropy, but the former directly takes the ground truth labels (integer class indices) as inputs, while the latter requires the ground truth labels to be one-hot encoded beforehand.
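To make the difference concrete, here is a minimal sketch of the two label formats, using illustrative labels for a hypothetical 3-class problem:
import numpy as np
import tensorflow as tf

# Integer class indices, as expected by sparse_categorical_crossentropy:
y_sparse = np.array([2, 0, 1])

# One-hot encoded labels, as expected by categorical_crossentropy,
# e.g. class 2 becomes [0., 0., 1.]:
y_onehot = tf.keras.utils.to_categorical(y_sparse, num_classes=3)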