As practice, let's implement a basic residual block ourselves. As shown in Figure 4.5, the residual path consists of two convolutional layers, each followed by batch normalization. The ReLU activation function is applied directly after the first convolution; for the second, it is applied only after merging with the shortcut path. Using the Keras Functional API, the residual path can thus be implemented in five or six lines, as demonstrated in the following code.
The shortcut path is even simpler. It contains either no layer at all, or a single 1 × 1 convolution to reshape the input tensor when the residual path alters its dimensions (for instance, when a larger stride is used).
Finally, the results of the two paths are added together, and the ReLU function is applied to the sum. All in all, a basic residual block can be implemented as follows:
from tensorflow.keras.layers import Activation, Conv2D, BatchNormalization...
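Putting the three pieces together, here is a minimal sketch of such a block, assuming a hypothetical helper named residual_block_basic whose filters, kernel_size, and strides parameters are illustrative choices rather than the book's exact signature:

import tensorflow as tf
from tensorflow.keras.layers import Activation, add, BatchNormalization, Conv2D

def residual_block_basic(x, filters, kernel_size=3, strides=1):
    """Basic residual block: two convolutions plus an identity or 1x1 shortcut."""
    # Residual path: Conv -> BN -> ReLU -> Conv -> BN
    residual = Conv2D(filters, kernel_size=kernel_size, strides=strides,
                      padding='same', use_bias=False)(x)
    residual = BatchNormalization()(residual)
    residual = Activation('relu')(residual)
    residual = Conv2D(filters, kernel_size=kernel_size,
                      padding='same', use_bias=False)(residual)
    residual = BatchNormalization()(residual)

    # Shortcut path: identity, or a 1x1 convolution when the residual path
    # changes the spatial dimensions (strides > 1) or the number of channels
    if strides > 1 or x.shape[-1] != filters:
        shortcut = Conv2D(filters, kernel_size=1, strides=strides,
                          padding='same', use_bias=False)(x)
    else:
        shortcut = x

    # Merge the two paths and apply the final ReLU to the sum
    merged = add([residual, shortcut])
    return Activation('relu')(merged)

Because this is a plain function operating on tensors, it can be chained like any other Functional API call, for instance x = residual_block_basic(x, 64) repeated several times to stack residual blocks into a full network.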