So far, in the previous recipes, we used the Adam optimizer's default learning rate, which is 0.001.
In this section, we will manually set the learning rate to a higher value and observe the impact of this change on model accuracy, reusing the same MNIST training and test datasets that were scaled in the previous recipes.
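As a minimal sketch of this step (assuming TensorFlow/Keras, as in the earlier recipes; the model architecture shown here is illustrative and may differ from the one built previously), we pass an explicit learning rate to the Adam optimizer when compiling the model, rather than relying on the default:

```python
from tensorflow import keras
from tensorflow.keras import layers

# Hypothetical model standing in for the one built in the previous recipes.
model = keras.Sequential([
    layers.Flatten(input_shape=(28, 28)),
    layers.Dense(512, activation="relu"),
    layers.Dense(10, activation="softmax"),
])

# Raise the learning rate from the default (0.001) to a higher value,
# e.g. 0.01, to observe its effect on training.
optimizer = keras.optimizers.Adam(learning_rate=0.01)

model.compile(
    optimizer=optimizer,
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
```

With the optimizer configured this way, the rest of the training loop (the call to `model.fit` on the scaled MNIST data) stays the same; only the step size of each weight update changes.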