This chapter covered several advances in neural network architectures and their application to a varied set of real-world problems. We discussed why these specialized architectures are needed, and why a simple deep multilayer neural network will not sufficiently solve all sorts of problems, even though it has great expressive power and a rich hypothesis space. Many of the architectures discussed here will be used in later chapters when covering transfer learning use cases. References to the Python code for almost all of the architectures are provided. We have also tried to clearly explain some of the very recent architectures, such as CapsNet, MemNNs, and NTMs. We will refer back to this chapter frequently while walking you through the transfer learning use cases.
The next chapter will introduce the concepts of transfer learning.