To explore further improvements in image classification, in this section we will run three experiments. In the first experiment, we will use the Adam optimizer when compiling the model. In the second, we will carry out hyperparameter tuning by varying the number of units in the dense layer, the dropout percentage in the dropout layer, and the batch size when fitting the model. Finally, in the third experiment, we will work with another pretrained network called VGG16.
Performance optimization tips and best practices
Experimenting with the Adam optimizer
In this first experiment, we will use the Adam optimizer when compiling the model. When training the model, we will also increase the number of epochs...
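As a rough sketch of what this might look like in Keras, the snippet below compiles a small model with the Adam optimizer and trains it for a larger number of epochs. The architecture, input shape, learning rate, epoch count, and the randomly generated placeholder data are all illustrative assumptions, not the chapter's actual model or dataset.

import numpy as np
from tensorflow.keras import Input, layers, models, optimizers

# Placeholder data standing in for the real image dataset
# (the shapes are assumptions for illustration only).
x_train = np.random.rand(100, 150, 150, 3).astype("float32")
y_train = np.random.randint(0, 2, size=(100,)).astype("float32")

# A small stand-in model; the chapter's actual architecture may differ.
model = models.Sequential([
    Input(shape=(150, 150, 3)),
    layers.Flatten(),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(1, activation="sigmoid"),
])

# Compile with the Adam optimizer; the learning rate is an illustrative value.
model.compile(
    optimizer=optimizers.Adam(learning_rate=1e-4),
    loss="binary_crossentropy",
    metrics=["accuracy"],
)

# Train for more epochs than before; 30 is an arbitrary example value.
history = model.fit(x_train, y_train, epochs=30, batch_size=32,
                    validation_split=0.2)

Unlike an optimizer with a single global learning rate, Adam adapts the step size per parameter, which often speeds up convergence and makes it a common first choice when retraining a classifier.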