We will now use the information we gained in the Performing a grid search using scikit-learn section to optimize other aspects of our model. It looks like we might be overfitting the data slightly, since we are getting better results on our training data than on our testing data. We will now look at adding dropout regularization:
- Our first step is to copy the code from the grid search cell that we ran in the previous section and paste it into a fresh cell. We will keep the general structure of the code and adjust some of its parameters.
- We will then import the Dropout layer from keras.layers using the following line:
from keras.layers import Dropout
- We will now convert the learning rate into a variable by defining it in the Adam optimizer code block. We will use learn_rate as the name of this variable so that the grid search can tune it, as shown in the sketch after this list.
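
Putting these steps together, here is a minimal sketch of what the modified grid search cell might look like. The network shape (input_dim=8, 16 hidden units), the parameter values in param_grid, and the build_model, X, and y names are illustrative assumptions, not the exact code from the previous section:

```python
from keras.models import Sequential
from keras.layers import Dense, Dropout
from keras.optimizers import Adam
from keras.wrappers.scikit_learn import KerasClassifier
from sklearn.model_selection import GridSearchCV

def build_model(learn_rate=0.001, dropout_rate=0.0):
    # Network shape is a placeholder; reuse the architecture
    # from the earlier grid search cell in practice
    model = Sequential()
    model.add(Dense(16, activation='relu', input_dim=8))
    model.add(Dropout(dropout_rate))   # dropout regularization
    model.add(Dense(1, activation='sigmoid'))
    # Learning rate is now a variable passed into the Adam optimizer
    optimizer = Adam(lr=learn_rate)
    model.compile(loss='binary_crossentropy',
                  optimizer=optimizer,
                  metrics=['accuracy'])
    return model

model = KerasClassifier(build_fn=build_model,
                        epochs=50, batch_size=10, verbose=0)

# Candidate values are examples; both parameters are routed
# to build_model by the scikit-learn wrapper during the search
param_grid = {'learn_rate': [0.001, 0.01, 0.1],
              'dropout_rate': [0.0, 0.1, 0.2]}
grid = GridSearchCV(estimator=model, param_grid=param_grid, cv=3)
# grid_result = grid.fit(X, y)  # X, y assumed from earlier in the chapter
```

Because learn_rate and dropout_rate are keyword arguments of build_model, GridSearchCV can vary them directly through param_grid, letting us tune the dropout rate and the learning rate in the same search.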