Early stopping
When should we stop training? That's a good question! Ideally, you want to stop at the minimum of the validation error. You cannot know this point in advance, but you can inspect the losses during training to get an idea of how many epochs you need. However, the right number of epochs changes as you tune your model, so it is hard to know in advance when to stop.
We already know that we can use ModelCheckpoint, a Keras callback, to save the model with the best validation error seen during training. There is also another very useful callback, EarlyStopping, which stops training when a predefined set of conditions is met:
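As a reminder, a minimal sketch of saving the best model with ModelCheckpoint might look like the following (the filename is illustrative):

```python
from tensorflow.keras.callbacks import ModelCheckpoint

# Save only the best model seen so far, judged by validation loss.
# 'best_model.h5' is an illustrative filename.
checkpoint = ModelCheckpoint(
    'best_model.h5',
    monitor='val_loss',
    save_best_only=True,
    verbose=1,
)
# Later passed to training as: model.fit(..., callbacks=[checkpoint])
```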
stop = EarlyStopping(min_delta=0.0005, patience=7, verbose=1)
The most important parameters to configure early stopping are the following:
monitor: The quantity to monitor; by default, this is the validation loss (val_loss).
min_delta: The minimum change in the monitored quantity that counts as an improvement; if the validation loss improves by less than this value between epochs, the epoch is counted as showing no improvement.
patience: The number of consecutive epochs without improvement after which training is stopped.
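To make the interplay between min_delta and patience concrete, here is a plain-Python sketch of the stopping rule (not the actual Keras implementation, just the logic it applies to the monitored loss):

```python
def should_stop(losses, min_delta=0.0005, patience=7):
    """Return the epoch index at which early stopping would trigger,
    or None if training runs through all of `losses`.

    An epoch counts as an improvement only if the loss drops by more
    than min_delta below the best value seen so far; after `patience`
    consecutive epochs without improvement, training stops.
    """
    best = float('inf')
    wait = 0
    for epoch, loss in enumerate(losses):
        if loss < best - min_delta:  # a real improvement
            best = loss
            wait = 0
        else:                        # no (meaningful) improvement
            wait += 1
            if wait >= patience:
                return epoch
    return None
```

For example, a run whose loss keeps dropping never triggers the stop, while a plateau triggers it after exactly `patience` stagnant epochs, even if the loss is still creeping down by less than min_delta.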