In Keras, adding a dropout layer is also very simple. All you are required to do is call the model.add() method again, this time specifying a Dropout layer (instead of the Dense layers we have been using so far). The Dropout layer in Keras takes a float value between 0 and 1 that refers to the fraction of the preceding layer's outputs to be randomly dropped during training. A very low dropout rate might not provide the robustness we are looking for, while a high dropout rate simply means we have a network prone to amnesia, incapable of retaining any useful representations. Once again, we strive for a dropout value that is just right; conventionally, the dropout rate is set between 0.2 and 0.4:
#Simple feed forward neural network
model=Sequential()
#feeds in the image, a 28 x 28 pixel matrix, as one sequence of 784 values
model.add(Flatten(input_shape=(28, 28)))
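#A minimal, hedged continuation (illustrative, not taken verbatim from the text):
#the 128-unit Dense layer, the relu activation, and the 0.3 dropout rate are
#assumptions, and Dense/Dropout are assumed to be imported from keras.layers
model.add(Dense(128, activation='relu'))
model.add(Dropout(0.3)) #randomly drops 30% of the previous layer's outputs during training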