Building a smarter network with batch normalization
We already normalize the input we feed to the network, constraining its values to the range 0 to 1, so it stands to reason that normalizing values in the middle of the network could be beneficial as well. This is called batch normalization: each layer's output is standardized to zero mean and unit variance over the current mini-batch and then rescaled by learned parameters, and it does wonders for training!
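To see concretely what the layer computes, here is a minimal sketch in NumPy (an illustration, not part of our model code): during training, batch normalization standardizes each feature to zero mean and unit variance over the current batch, then applies a learned scale (gamma) and offset (beta).

import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # x has shape (batch_size, num_features); gamma and beta have shape (num_features,)
    mean = x.mean(axis=0)                    # per-feature mean over the batch
    var = x.var(axis=0)                      # per-feature variance over the batch
    x_hat = (x - mean) / np.sqrt(var + eps)  # zero mean, unit variance
    return gamma * x_hat + beta              # learned rescaling

x = np.random.rand(64, 10) * 100             # a batch with arbitrary scale
y = batch_norm(x, gamma=np.ones(10), beta=np.zeros(10))
print(y.mean(axis=0).round(3), y.std(axis=0).round(3))  # ~0 and ~1 per feature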
In general, you should add batch normalization right after the output you want to normalize and before the activation, which is the placement proposed in the original paper; however, adding it after the activation can give faster performance, and this is what we will do.
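For comparison, here is a minimal sketch of the before-activation placement (the input shape is a placeholder chosen for illustration; our code below uses the after-activation placement instead):

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, BatchNormalization, Activation

model = Sequential()
model.add(Conv2D(filters=16, kernel_size=(3, 3), padding="same",
                 input_shape=(32, 32, 3)))  # placeholder input shape; no activation yet
model.add(BatchNormalization())             # normalize the raw convolution output
model.add(Activation('relu'))               # apply the nonlinearity afterwards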
This is the new code (dense layers omitted):
from tensorflow.keras.layers import Conv2D, AveragePooling2D, BatchNormalization

# First block: two 16-filter convolutions, then normalization and pooling
model.add(Conv2D(filters=16, kernel_size=(3, 3), activation='relu',
                 input_shape=x_train.shape[1:], padding="same"))
model.add(Conv2D(filters=16, kernel_size=(3, 3), activation='relu', padding="same"))
model.add(BatchNormalization())
model.add(AveragePooling2D())

# Second block: same structure, with 32 filters
model.add(Conv2D(filters=32, kernel_size=(3, 3), activation='relu', padding="same"))
model.add(Conv2D(filters=32, kernel_size=(3, 3), activation='relu', padding="same"))
model.add(BatchNormalization())
model.add(AveragePooling2D())
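Note that each BatchNormalization layer adds four parameters per channel: the learned scale (gamma) and offset (beta), which are trained with the rest of the network, plus a moving mean and moving variance, which are tracked during training and used at inference time. Only the first two are trainable, which you can verify in the output of model.summary().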