In this chapter, we explored some modern architectures, such as ResNet, Inception, and DenseNet. We also saw how these models can be used for transfer learning and ensembling, and introduced the encoder–decoder architecture, which powers many applications, such as language translation systems.
In the next chapter, we will wrap up our learning journey through the book and discuss where you can go from here. We will also look at a range of PyTorch resources and some interesting deep learning projects that have been built with, or are being actively researched using, PyTorch.