Automatic differentiation, also known as algorithmic differentiation, is a technique for computing the derivatives of a function numerically in an automated way. It is useful for computing gradients, Jacobians, and Hessians in applications such as numerical optimization. The backpropagation algorithm is an implementation of the reverse mode of automatic differentiation for calculating the gradient.
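As a concrete illustration (a minimal sketch, not from the original text), TensorFlow 1.x exposes reverse-mode differentiation directly through tf.gradients. Here it builds the derivative dy/dx for y = x² and evaluates it at x = 3:

import tensorflow as tf

x = tf.Variable(3.0)
y = x * x
# tf.gradients adds the gradient ops for dy/dx = 2x to the graph
grad = tf.gradients(y, [x])[0]

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(grad))  # prints 6.0

Note that tf.gradients does not evaluate anything by itself; it extends the computational graph with gradient nodes, which are then executed inside the session like any other operation.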
In the following example, using the MNIST dataset, we calculate the loss with one of the loss functions. The question is: how do we fit the model to the data?
We can create an optimizer using tf.train.Optimizer (in practice, one of its subclasses, such as tf.train.GradientDescentOptimizer). Calling tf.train.Optimizer.minimize(loss, var_list) adds an optimization operation to the computational graph, and automatic differentiation computes the gradients without any user input:
import tensorflow as tf

# load the MNIST dataset via the TensorFlow 1.x tutorial helper
# (the download directory "MNIST_data/" is illustrative)
from tensorflow.examples.tutorials.mnist import input_data
mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)
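Building on the loading code above, a minimal sketch of how minimize() wires automatic differentiation into training might look like the following. The softmax-regression model, placeholder names, learning rate, and batch size are illustrative assumptions, not taken from the original text:

# illustrative softmax-regression model: 784 pixels -> 10 classes
x = tf.placeholder(tf.float32, [None, 784])   # flattened 28x28 images
y_ = tf.placeholder(tf.float32, [None, 10])   # one-hot labels

W = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))
logits = tf.matmul(x, W) + b

# cross-entropy loss averaged over the batch
loss = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=y_, logits=logits))

# minimize() differentiates loss with respect to every trainable
# variable and appends the gradient-descent update ops to the graph
train_step = tf.train.GradientDescentOptimizer(0.5).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(1000):
        batch_xs, batch_ys = mnist.train.next_batch(100)
        sess.run(train_step, feed_dict={x: batch_xs, y_: batch_ys})

Because minimize() pairs the gradient computation with the update rule, each run of train_step performs one full step of gradient descent; we never have to derive or code the gradients ourselves.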