Chapter 1, The Nuts and Bolts of Neural Networks, will briefly introduce what deep learning is and then discuss the mathematical underpinnings of NNs, treating them as mathematical models. More specifically, we'll focus on vectors, matrices, and differential calculus. We'll also cover several gradient descent variations, such as Momentum, Adam, and Adadelta, in depth, and we'll discuss how to deal with imbalanced datasets.
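To give a flavor of the material, here is a minimal NumPy sketch of gradient descent with momentum (a simplified illustration with hypothetical names, not the chapter's exact code):

```python
import numpy as np

def sgd_momentum(grad_fn, w, lr=0.01, gamma=0.9, steps=100):
    """Momentum update: v = gamma * v + lr * grad(w); w = w - v."""
    v = np.zeros_like(w)
    for _ in range(steps):
        v = gamma * v + lr * grad_fn(w)
        w = w - v
    return w

# Example: minimize f(w) = ||w||^2, whose gradient is 2w
w_opt = sgd_momentum(lambda w: 2 * w, np.array([3.0, -4.0]))
print(w_opt)  # close to [0, 0]
```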
Chapter 2, Understanding Convolutional Networks, will provide a short description of convolutional neural networks (CNNs) and discuss their applications in computer vision (CV).
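As a minimal illustration of the building block at the heart of the chapter, here is a single convolutional layer in PyTorch (a sketch with arbitrary toy dimensions, not the chapter's code):

```python
import torch
from torch import nn

# A single convolutional layer: 3 input channels (RGB), 16 output feature maps
conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, padding=1)

# Apply it to a dummy batch of one 32x32 RGB image
x = torch.randn(1, 3, 32, 32)
feature_maps = conv(x)
print(feature_maps.shape)  # torch.Size([1, 16, 32, 32])
```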
Chapter 3, Advanced Convolutional Networks, will discuss some advanced and widely used NN architectures, including VGG, ResNet, MobileNets, GoogLeNet, Inception, Xception, and DenseNets. We'll also implement ResNet and Xception/MobileNets using PyTorch.
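While the chapter builds these networks in PyTorch, torchvision also ships reference implementations; here is a hedged sketch of instantiating one (the weights argument mentioned in the comment assumes a recent torchvision version):

```python
import torch
from torchvision import models

# Instantiate a ResNet-50; in recent torchvision versions, pass
# weights="IMAGENET1K_V1" to load pretrained ImageNet weights
model = models.resnet50()
model.eval()

# Forward pass on a dummy batch of one 224x224 RGB image
x = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    logits = model(x)
print(logits.shape)  # torch.Size([1, 1000])
```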
Chapter 4, Object Detection and Image Segmentation, will discuss two important vision tasks: object detection and image segmentation. We'll provide implementations for both of them.
Chapter 5, Generative Models, will begin the discussion of generative models. In particular, we'll talk about generative adversarial networks and neural style transfer, and we'll also provide an implementation of the latter.
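One core ingredient of style transfer is the Gram matrix, which captures the style of a layer's feature maps via their channel-wise correlations; here is a minimal NumPy sketch (a simplified illustration with toy shapes, not the book's implementation):

```python
import numpy as np

def gram_matrix(features):
    # features: (channels, height, width) activation volume from a conv layer
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    # Channel-by-channel correlations, normalized by the number of elements
    return (f @ f.T) / (c * h * w)

# Toy example: 8 feature maps of size 4x4
g = gram_matrix(np.random.randn(8, 4, 4))
print(g.shape)  # (8, 8)
```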
Chapter 6, Language Modeling, will introduce word- and character-level language models. We'll also talk about word vectors (word2vec, GloVe, and fastText) and we'll use Gensim to implement them. We'll also walk through the intricate process of preparing text data for machine learning applications such as topic modeling and sentiment analysis with the help of the Natural Language Toolkit's (NLTK) text processing techniques.
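As a taste of the Gensim workflow, here is a minimal word2vec example on a toy corpus (the vector_size parameter name assumes Gensim 4.x, where it replaced the older size):

```python
from gensim.models import Word2Vec

# A toy corpus: each sentence is a list of tokens
sentences = [
    ['deep', 'learning', 'uses', 'neural', 'networks'],
    ['word', 'vectors', 'capture', 'semantic', 'similarity'],
    ['neural', 'networks', 'learn', 'word', 'vectors'],
]

# Train a small skip-gram model (sg=1) on the toy corpus
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

# Query the learned embedding space
print(model.wv.most_similar('neural', topn=3))
```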
Chapter 7, Understanding Recurrent Networks, will discuss basic recurrent networks, as well as LSTM and GRU cells. We'll provide detailed explanations and pure Python implementations for all of these networks.
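In the same from-scratch spirit as the chapter, here is a minimal NumPy sketch of a single vanilla RNN step (toy dimensions and hypothetical names, not the book's code):

```python
import numpy as np

def rnn_step(x, h_prev, W_xh, W_hh, b_h):
    """One step of a vanilla RNN: h_t = tanh(W_xh @ x + W_hh @ h_prev + b)."""
    return np.tanh(W_xh @ x + W_hh @ h_prev + b_h)

# Toy dimensions: 4-dimensional inputs, 3-dimensional hidden state
rng = np.random.default_rng(0)
W_xh, W_hh, b_h = rng.normal(size=(3, 4)), rng.normal(size=(3, 3)), np.zeros(3)

h = np.zeros(3)
for x in rng.normal(size=(5, 4)):  # unroll over a sequence of 5 inputs
    h = rnn_step(x, h, W_xh, W_hh, b_h)
print(h)  # final hidden state
```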
Chapter 8, Sequence-to-Sequence Models and Attention, will discuss sequence-to-sequence models and the attention mechanism, including bidirectional LSTMs, as well as the transformer architecture with its encoders and decoders.
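At the heart of the transformer is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k))V; here is a minimal NumPy sketch (a simplified single-head, unmasked illustration):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # Numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

# Toy example: 3 queries attending over 4 key/value pairs, d_k = 8
rng = np.random.default_rng(1)
Q, K, V = rng.normal(size=(3, 8)), rng.normal(size=(4, 8)), rng.normal(size=(4, 8))
print(scaled_dot_product_attention(Q, K, V).shape)  # (3, 8)
```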
Chapter 9, Emerging Neural Network Designs, will discuss graph NNs and NNs with memory, such as Neural Turing Machines (NTMs), differentiable neural computers (DNCs), and memory-augmented neural networks (MANNs).
Chapter 10, Meta Learning, will discuss meta learning, that is, teaching algorithms how to learn. We'll also try to improve upon deep learning algorithms by giving them the ability to learn more information from fewer training samples.
Chapter 11, Deep Learning for Autonomous Vehicles, will explore the applications of deep learning in autonomous vehicles. We'll discuss how to use deep networks to help the vehicle make sense of its surrounding environment.