We started off the chapter by understanding what deep learning is and how it differs from machine learning. Next, we learned how biological and artificial neurons work, and then we explored the input, hidden, and output layers of an ANN, along with several types of activation functions.
Going ahead, we learned what forward propagation is and how an ANN uses it to predict the output. After this, we learned how an ANN uses backpropagation for learning and optimization. We studied gradient descent, an optimization algorithm that helps the neural network minimize the loss and make correct predictions. We also learned about gradient checking, a technique used to verify that the gradients computed during backpropagation are correct. At the end of the chapter, we implemented a neural network from scratch to perform the XOR gate operation.
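As a quick refresher on the gradient checking idea recapped above, the following is a minimal sketch (not the chapter's exact code): it assumes a simple sigmoid unit with a mean squared error loss and compares a numerically estimated gradient against an analytical one using a relative difference.

```python
import numpy as np

def loss(theta, x, y):
    # Hypothetical scalar loss: a single linear unit with sigmoid
    # activation and mean squared error, standing in for the network.
    y_hat = 1.0 / (1.0 + np.exp(-(x @ theta)))
    return float(np.mean((y_hat - y) ** 2))

def numerical_gradient(theta, x, y, eps=1e-5):
    # Central-difference approximation of dL/dtheta, one parameter
    # at a time: (L(theta + eps) - L(theta - eps)) / (2 * eps).
    grad = np.zeros_like(theta)
    for i in range(theta.size):
        step = np.zeros_like(theta)
        step[i] = eps
        grad[i] = (loss(theta + step, x, y) - loss(theta - step, x, y)) / (2 * eps)
    return grad

def relative_difference(analytic, numeric):
    # A small value (e.g. below 1e-7) suggests the analytical
    # gradients from backpropagation are implemented correctly.
    return np.linalg.norm(analytic - numeric) / (
        np.linalg.norm(analytic) + np.linalg.norm(numeric) + 1e-12
    )
```

In practice, the analytical gradient produced by the backpropagation code would be passed to `relative_difference` together with the output of `numerical_gradient`; gradient checking is only run occasionally as a debugging step, since the numerical estimate is far too slow for training.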
In the next chapter, we will learn about one of...