TensorFlow supports automatic differentiation: a TensorFlow optimizer can compute the gradients and apply them for us, automatically updating the tensors defined as variables. In this recipe, we will use a TensorFlow optimizer to train the network.
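To make the idea concrete, here is a minimal pure-NumPy sketch of what a gradient-descent optimizer step does internally: compute the gradient of a loss with respect to a variable and nudge the variable in the direction that reduces the loss. The variable `w`, the quadratic loss, and the learning rate are illustrative choices, not part of the recipe.

```python
import numpy as np

w = np.array([2.0])   # a "variable" to be trained
lr = 0.1              # learning rate

def loss(w):
    # simple quadratic loss with its minimum at w = 5
    return float((w[0] - 5.0) ** 2)

for _ in range(100):
    grad = 2.0 * (w[0] - 5.0)   # analytic gradient of the loss
    w[0] -= lr * grad           # gradient-descent update

# w has been driven close to the minimizer, 5.0
```

An optimizer bundles exactly these two steps (gradient computation and variable update) behind a single call, so we never have to write the update rule by hand.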
MNIST classifier using MLP
Getting ready
In the Backpropagation algorithm recipe, we defined the layers, weights, loss, gradients, and the weight updates manually. Working through the equations by hand is a good exercise for understanding, but it becomes quite cumbersome as the number of layers in the network increases.
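The following NumPy sketch illustrates the kind of manual bookkeeping the previous recipe required: for every layer we store the weights, compute the activations, derive the gradients via the chain rule, and apply each update ourselves. All names and sizes here are illustrative, not taken from the recipe; note how every extra layer would add another block of forward, backward, and update lines.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))         # batch of 4 inputs, 3 features
y = rng.normal(size=(4, 1))         # regression targets

W1 = rng.normal(size=(3, 5)) * 0.1  # layer 1 weights
W2 = rng.normal(size=(5, 1)) * 0.1  # layer 2 weights
lr = 0.1

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mse(W1, W2):
    return float(np.mean((sigmoid(x @ W1) @ W2 - y) ** 2))

loss_before = mse(W1, W2)
for _ in range(50):
    # forward pass, layer by layer
    h = sigmoid(x @ W1)
    out = h @ W2
    # backward pass: derive each gradient by hand via the chain rule
    d_out = 2.0 * (out - y) / len(y)      # dL/d_out for MSE
    dW2 = h.T @ d_out
    d_h = (d_out @ W2.T) * h * (1.0 - h)  # through the sigmoid
    dW1 = x.T @ d_h
    # manual updates, one per layer
    W2 -= lr * dW2
    W1 -= lr * dW1
loss_after = mse(W1, W2)
```

With an optimizer and predefined layers, the entire backward pass and both update lines collapse into a single training call, which is exactly the convenience this recipe exploits.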
In this recipe, we will use powerful TensorFlow features, such as Contrib (Layers), to define the neural network layers, and TensorFlow's own optimizer...