Improving the validation accuracy with dropout
One source of overfitting is that a neural network may come to rely too heavily on a few specific neurons to draw its conclusions; if those neurons are wrong, the whole network is wrong. One way to reduce this problem is to randomly deactivate some neurons during training while keeping all of them active during inference. In this way, the network learns not to depend on any single neuron, becomes more robust to errors, and generalizes better. This mechanism is called dropout, and Keras supports it out of the box. Dropout increases training time, as the network needs more epochs to converge, and it may also require a larger network, since some neurons are deactivated at every training step. It is most useful when the dataset is small relative to the network, which is exactly when overfitting is most likely. In practice, because dropout is meant to reduce overfitting, it brings little benefit if your network is not overfitting in the first place.
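To make this concrete, here is a minimal sketch of how dropout layers might be inserted between dense layers in Keras; the input shape, layer sizes, dropout rate, and number of output classes are illustrative assumptions, not values from the text:

from tensorflow import keras
from tensorflow.keras import layers

# Hypothetical classifier on 784-dimensional inputs with 10 classes,
# used only to show where Dropout layers are placed.
model = keras.Sequential([
    keras.Input(shape=(784,)),
    layers.Dense(512, activation="relu"),
    layers.Dropout(0.5),   # randomly zeroes 50% of activations, only during training
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

Note that the Dropout layers are active only while fitting the model; Keras automatically disables them when you call predict() or evaluate(), so no change is needed for inference.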
A typical value of dropout for dense layers is 0.5, though we might...