Customizing the training process using tf.GradientTape
One of TensorFlow's biggest competitors is another well-known framework: PyTorch. What made PyTorch so attractive before the arrival of TensorFlow 2.x was the level of control it gives its users, particularly when it comes to training neural networks.
If we are working with fairly traditional neural networks to solve common problems, such as image classification, we don't need much control over how the model is trained, and can therefore rely on TensorFlow's (and the Keras API's) built-in capabilities, loss functions, and optimizers without a problem.
But what if we are researchers exploring new ways of doing things, new architectures, and novel strategies to solve challenging problems? In the past, that's when we had to resort to PyTorch, because customizing the training of a model was considerably easier there than in TensorFlow 1.x, but not anymore! TensorFlow 2.x's tf.GradientTape gives us that same fine-grained control over the training process.
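To make this concrete, here is a minimal sketch of a custom training step built around tf.GradientTape; the toy model, loss function, and optimizer are illustrative choices, not part of the recipe itself:

```python
import tensorflow as tf

# Illustrative model, loss, and optimizer (placeholders, not the
# recipe's actual components).
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.Dense(10)
])
loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)
optimizer = tf.keras.optimizers.Adam()

@tf.function
def train_step(inputs, labels):
    # Record the forward pass so gradients can be computed from it.
    with tf.GradientTape() as tape:
        logits = model(inputs, training=True)
        loss = loss_fn(labels, logits)
    # Compute gradients of the loss with respect to the trainable
    # weights, then let the optimizer apply them.
    gradients = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(gradients, model.trainable_variables))
    return loss
```

Because we write the forward pass, the loss computation, and the gradient update ourselves, every step of the loop is ours to modify, which is exactly the kind of control PyTorch users have long enjoyed.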