Adding dropouts to prevent overfitting

One of the most popular methods to prevent overfitting in neural networks is adding dropouts. In Chapter 2, Feed-Forward Neural Networks, we introduced dropouts, and we've used them throughout the book. In the following recipe, we demonstrate, as we did in Chapter 2, Feed-Forward Neural Networks, the difference in performance when adding dropouts. This time, we will use the CIFAR-10 dataset.
How to do it...
- We start by importing all libraries as follows:
from keras.models import Sequential
from keras.layers import Dense, Dropout, Activation, Flatten
from keras.layers import Conv2D, MaxPooling2D
from keras.optimizers import Adam
from sklearn.model_selection import train_test_split
from keras.datasets import cifar10
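
With the libraries imported, the remaining steps of the recipe load CIFAR-10, define a convolutional network, and train it with and without Dropout layers so the difference in validation performance becomes visible. The following is a minimal sketch of those steps using only the imports above; the exact architecture, dropout rates, batch size, and number of epochs are illustrative assumptions rather than the recipe's final settings.

# Load and normalize CIFAR-10; labels become one-hot vectors
(X_train, y_train), (X_test, y_test) = cifar10.load_data()
X_train = X_train.astype('float32') / 255.
X_test = X_test.astype('float32') / 255.
y_train = to_categorical(y_train, 10)
y_test = to_categorical(y_test, 10)

# Hold out a validation set from the training data
X_train, X_val, y_train, y_val = train_test_split(X_train, y_train, test_size=0.2, random_state=42)

def build_model(use_dropout=False):
    # Small CNN; the dropout rates (0.25 and 0.5) are assumptions for illustration
    model = Sequential()
    model.add(Conv2D(32, (3, 3), padding='same', input_shape=X_train.shape[1:]))
    model.add(Activation('relu'))
    model.add(Conv2D(32, (3, 3)))
    model.add(Activation('relu'))
    model.add(MaxPooling2D(pool_size=(2, 2)))
    if use_dropout:
        model.add(Dropout(0.25))
    model.add(Flatten())
    model.add(Dense(512))
    model.add(Activation('relu'))
    if use_dropout:
        model.add(Dropout(0.5))
    model.add(Dense(10))
    model.add(Activation('softmax'))
    model.compile(loss='categorical_crossentropy',
                  optimizer=Adam(lr=0.001),
                  metrics=['accuracy'])
    return model

# Train the same architecture without and with dropout and compare
# the gap between training and validation accuracy
for use_dropout in (False, True):
    model = build_model(use_dropout)
    model.fit(X_train, y_train,
              batch_size=128, epochs=20,
              validation_data=(X_val, y_val),
              verbose=2)

This sketch assumes a to_categorical import (from keras.utils) in addition to the imports shown in the step above; in the model without dropout we expect the training accuracy to keep climbing while the validation accuracy stalls, whereas the dropout variant should show a smaller gap between the two.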