3. ResNet v2
The improvements in ResNet v2 are mainly in the arrangement of layers within the residual block, as shown in Figure 2.3.1.
The prominent changes in ResNet v2 are:
- The use of a stack of 1 × 1 – 3 × 3 – 1 × 1 BN-ReLU-Conv2D layers
- Batch normalization and ReLU activation come before the 2D convolution (pre-activation); see the sketch after Figure 2.3.1
Figure 2.3.1: A comparison of residual blocks between ResNet v1 and ResNet v2
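To make the pre-activation ordering concrete, the following is a minimal sketch of a single ResNet v2-style bottleneck block written with the Keras functional API. It is illustrative only and is not the book's resnet_v2() builder; the function name preact_bottleneck_block and the tensorflow.keras imports are assumptions made here for the example.

from tensorflow.keras import layers

def preact_bottleneck_block(x, filters):
    """Illustrative ResNet v2 bottleneck block (sketch, not the book's code).

    BN and ReLU come before each Conv2D (pre-activation), and the
    convolutions follow the 1 x 1 - 3 x 3 - 1 x 1 bottleneck pattern.
    """
    shortcut = x

    # 1 x 1: BN-ReLU-Conv2D reduces the number of channels
    y = layers.BatchNormalization()(x)
    y = layers.Activation('relu')(y)
    y = layers.Conv2D(filters, kernel_size=1, padding='same')(y)

    # 3 x 3: BN-ReLU-Conv2D operates on the reduced feature maps
    y = layers.BatchNormalization()(y)
    y = layers.Activation('relu')(y)
    y = layers.Conv2D(filters, kernel_size=3, padding='same')(y)

    # 1 x 1: BN-ReLU-Conv2D expands the channels back out
    y = layers.BatchNormalization()(y)
    y = layers.Activation('relu')(y)
    y = layers.Conv2D(4 * filters, kernel_size=1, padding='same')(y)

    # identity shortcut; a 1 x 1 Conv2D projection is used only when
    # the channel counts of the shortcut and the block output differ
    if shortcut.shape[-1] != 4 * filters:
        shortcut = layers.Conv2D(4 * filters, kernel_size=1,
                                 padding='same')(shortcut)
    return layers.Add()([shortcut, y])

# example usage (hypothetical input shape):
# inputs = layers.Input(shape=(32, 32, 16))
# outputs = preact_bottleneck_block(inputs, filters=16)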
ResNet v2 is also implemented in the same code file, resnet-cifar10-2.2.1.py, as shown in Listing 2.2.1:
Listing 2.2.1: resnet-cifar10-2.2.1.py
def resnet_v2(input_shape, depth, num_classes=10):
"""ResNet Version 2 Model builder [b]
Stacks of (1 x 1)-(3 x 3)-(1 x 1) BN-ReLU-Conv2D or
also known as bottleneck layer.
First shortcut connection per layer is 1 x 1 Conv2D.
Second and onwards shortcut connection is identity.
At the beginning of each stage,
the feature map size...