Understanding the ResNet architecture
When a network is made too deep, two problems arise. During forward propagation, the last few layers retain almost no information about what the original image was. During backpropagation, the first few layers near the input hardly receive any gradient updates because of vanishing gradients (in other words, the gradients are close to zero). To solve both problems, ResNet uses a highway-like connection that carries raw information from earlier layers directly to later layers. In theory, even the last layer retains the entire information of the original image because of this highway-like connection, and because the skip connections bypass the intermediate layers, the backward gradients flow freely to the initial layers with little modification.
The term residual in a residual network refers to the additional information that the model is expected to learn on top of what it receives from the previous layer, which is then passed on to the next layer.
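As a minimal sketch of this idea (the layer, channel count, and tensor shape here are illustrative assumptions, not taken from the original), if a block receives an input x and its convolutional path computes F(x), the block outputs F(x) + x, so the convolutional path only has to learn the residual, that is, the difference between the desired output and x:

```python
import torch
import torch.nn as nn

# Hypothetical single convolutional layer standing in for F(x)
layer = nn.Conv2d(64, 64, kernel_size=3, padding=1)

x = torch.randn(1, 64, 32, 32)  # dummy input tensor
out = layer(x) + x              # skip connection adds the raw input back
```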
A typical residual block appears as follows:
Figure...
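The following is a minimal sketch of such a residual block in PyTorch; the choice of two 3x3 convolutions with batch normalization and a fixed channel count is an assumption that mirrors the standard ResNet basic block, not a detail taken from the figure:

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """A minimal residual block: two 3x3 convolutions plus a skip connection."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU()

    def forward(self, x):
        # Convolutional path learns the residual F(x)
        residual = self.relu(self.bn1(self.conv1(x)))
        residual = self.bn2(self.conv2(residual))
        # Skip connection adds the raw input back before the final activation
        return self.relu(residual + x)

# Quick check with a dummy batch: the output shape matches the input shape
block = ResidualBlock(64)
print(block(torch.randn(2, 64, 32, 32)).shape)  # torch.Size([2, 64, 32, 32])
```

Because the skip connection is a plain addition, the gradient of the loss with respect to the block's input includes an unmodified copy of the gradient flowing in from the block's output, which is what lets the initial layers keep receiving useful updates even in very deep networks.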