Early stopping in neural network training
An epoch is one complete round trip through the training data: a forward propagation pass followed by a backpropagation update of the weights and biases. Training stops once we reach convergence (minimal error) or after a preset number of iterations.
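For example, in R's nnet package both stopping criteria can be set directly when the model is trained. A minimal sketch on the iris data follows; the network size of 5 hidden units and the limit of 200 iterations are arbitrary choices for illustration:

```r
library(nnet)

# Stop after a preset number of iterations (maxit), or earlier if the
# fit criterion falls below abstol, or if the optimizer cannot improve
# it by a relative factor of at least reltol.
fit <- nnet(Species ~ ., data = iris, size = 5,
            maxit = 200, abstol = 1.0e-4, reltol = 1.0e-8,
            trace = FALSE)
```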
Early stopping is a technique used to deal with overfitting of the model (more on overfitting in the next few pages). The available data is split into two parts: one used for training and the other for validation. We had split our IRIS dataset in this way: 75 percent for training and 25 percent for validation.
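One common way to produce such a split in R is shown below; the seed value is arbitrary and only makes the split reproducible:

```r
set.seed(123)                                   # arbitrary seed, for reproducibility
idx   <- sample(nrow(iris), floor(0.75 * nrow(iris)))
train <- iris[idx, ]                            # 75 percent used for training
valid <- iris[-idx, ]                           # 25 percent held out for validation
```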
With the training data, we compute the gradient and update the network weights and biases. The second set of data, the validation (or test) set, is used to monitor overfitting. If the error on the validation set keeps increasing for a specified number of iterations, training is stopped and the weights and biases from the point of lowest validation error are kept as the final model. (In R's nnet package, the abstol and reltol arguments provide related stopping tolerances on the fit criterion.)
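nnet does not monitor a validation set on its own, so the loop below is a hand-rolled sketch of the idea: train in short chunks, carry the weights forward through the Wts argument, and stop once the validation error has failed to improve for a chosen number of rounds. The patience value of 5 and the chunk length of 5 iterations are arbitrary assumptions here:

```r
library(nnet)

set.seed(123)
idx   <- sample(nrow(iris), floor(0.75 * nrow(iris)))
train <- iris[idx, ]
valid <- iris[-idx, ]

patience  <- 5        # stop after 5 rounds without improvement (arbitrary)
best_err  <- Inf
best_wts  <- NULL
bad_steps <- 0
wts       <- NULL     # weights carried between training chunks

for (round in 1:100) {
  # Train for a short chunk, resuming from the previous weights if any
  fit <- if (is.null(wts)) {
    nnet(Species ~ ., data = train, size = 5, maxit = 5, trace = FALSE)
  } else {
    nnet(Species ~ ., data = train, size = 5, maxit = 5,
         Wts = wts, trace = FALSE)
  }
  wts <- fit$wts

  # Misclassification rate on the held-out 25 percent
  pred    <- predict(fit, valid, type = "class")
  val_err <- mean(pred != valid$Species)

  if (val_err < best_err) {
    best_err  <- val_err
    best_wts  <- wts    # remember the best weights seen so far
    bad_steps <- 0
  } else {
    bad_steps <- bad_steps + 1
    if (bad_steps >= patience) break    # early stop
  }
}
```

The weights in best_wts are the ones kept as the final model, mirroring the rule described above.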