Deep Neural Networks
In this chapter, we'll examine deep neural networks. These networks have demonstrated excellent classification accuracy on more challenging datasets such as ImageNet, CIFAR10 (https://www.cs.toronto.edu/~kriz/learning-features-2009-TR.pdf), and CIFAR100. For conciseness, we'll focus on only two networks: ResNet [2][4] and DenseNet [5]. While we'll go into much more detail later, it's worth taking a minute to introduce these networks first.
ResNet introduced the concept of residual learning, which made it possible to build very deep convolutional networks by mitigating the vanishing gradient problem (discussed in section 2).
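To make the idea concrete, the following is a minimal sketch of a residual block using the Keras functional API. The layer choices, filter count, and kernel size here are illustrative assumptions rather than the exact configuration used later in this chapter; the key point is the shortcut that adds the input back to the output of the stacked convolutions.

from tensorflow.keras.layers import Conv2D, BatchNormalization, Activation, Add

def residual_block(x, filters=16, kernel_size=3):
    # Two stacked conv-BN-ReLU layers form the residual mapping F(x)
    y = Conv2D(filters, kernel_size, padding='same')(x)
    y = BatchNormalization()(y)
    y = Activation('relu')(y)
    y = Conv2D(filters, kernel_size, padding='same')(y)
    y = BatchNormalization()(y)
    # The shortcut adds the input x back, so the block learns F(x) + x
    # and gradients can flow directly through the identity path
    # (assumes x already has `filters` channels)
    y = Add()([x, y])
    return Activation('relu')(y)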
DenseNet improved on ResNet further by allowing every convolution to have direct access to the inputs and to lower-layer feature maps. It also manages to keep the number of parameters low in deep networks by utilizing both the Bottleneck and Transition layers.
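As a rough illustration of these two ideas, here is a minimal sketch of a bottleneck layer and a transition layer in the Keras functional API. The growth rate, compression factor, and exact layer ordering are illustrative assumptions, not the precise settings we'll use later.

from tensorflow.keras.layers import (Conv2D, BatchNormalization, Activation,
                                     Concatenate, AveragePooling2D)

def dense_bottleneck_layer(x, growth_rate=12):
    # 1x1 bottleneck convolution reduces the number of feature maps
    # before the more expensive 3x3 convolution
    y = BatchNormalization()(x)
    y = Activation('relu')(y)
    y = Conv2D(4 * growth_rate, 1, padding='same')(y)
    y = BatchNormalization()(y)
    y = Activation('relu')(y)
    y = Conv2D(growth_rate, 3, padding='same')(y)
    # Concatenation (rather than addition) gives later layers direct
    # access to this layer's output and to all preceding feature maps
    return Concatenate()([x, y])

def transition_layer(x, compression=0.5):
    # 1x1 convolution compresses the channel count, then average pooling
    # halves the spatial size between dense blocks, keeping the
    # parameter count manageable
    n_channels = int(int(x.shape[-1]) * compression)
    y = BatchNormalization()(x)
    y = Conv2D(n_channels, 1, padding='same')(y)
    return AveragePooling2D(2)(y)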