You have now learned how to build a CNN and how to tune some of its hyperparameters, such as the number of epochs and the batch size, to get the desired results and to keep training running smoothly on different machines.
As an exercise, try training this model to recognize MNIST digits, and experiment with the structure of the convolutional layers: add Batch Normalization, or widen the fully connected layer.
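As a starting point for that exercise, here is a minimal sketch of a small CNN for MNIST with Batch Normalization and a wider fully connected layer. The framework choice (PyTorch) and all layer sizes here are assumptions for illustration; adapt them to whatever library and architecture the chapter used.

```python
# Minimal sketch (assumptions: PyTorch, layer sizes chosen for illustration):
# a small CNN for MNIST with Batch Normalization after each conv layer.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader
from torchvision import datasets, transforms


class MnistCnn(nn.Module):
    def __init__(self):
        super().__init__()
        # Two convolutional blocks, each followed by Batch Normalization.
        self.conv1 = nn.Conv2d(1, 16, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm2d(16)
        self.conv2 = nn.Conv2d(16, 32, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm2d(32)
        # A wider fully connected layer is one of the suggested experiments.
        self.fc1 = nn.Linear(32 * 7 * 7, 256)
        self.fc2 = nn.Linear(256, 10)

    def forward(self, x):
        x = F.max_pool2d(F.relu(self.bn1(self.conv1(x))), 2)  # 28x28 -> 14x14
        x = F.max_pool2d(F.relu(self.bn2(self.conv2(x))), 2)  # 14x14 -> 7x7
        x = x.flatten(1)
        x = F.relu(self.fc1(x))
        return self.fc2(x)


def train(epochs=3, batch_size=64, lr=1e-3):
    # Epochs and batch size are the hyperparameters discussed in this chapter;
    # try varying them and observing the effect on training.
    train_set = datasets.MNIST("data", train=True, download=True,
                               transform=transforms.ToTensor())
    loader = DataLoader(train_set, batch_size=batch_size, shuffle=True)

    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = MnistCnn().to(device)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()

    for epoch in range(epochs):
        for images, labels in loader:
            images, labels = images.to(device), labels.to(device)
            optimizer.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()
            optimizer.step()
        print(f"epoch {epoch + 1}: last batch loss {loss.item():.4f}")


if __name__ == "__main__":
    train()
```

Swapping the BatchNorm layers in or out, or changing the size of `fc1`, gives a direct feel for how each change affects convergence speed and final accuracy.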
The next chapter introduces reinforcement learning and Q-learning, and shows how to build a DQN to solve a maze.