In this chapter, we introduced neural networks in detail and discussed their success vis-à-vis other competing algorithms. Neural networks consist of interconnected neurons (or units), where the weights of the connections characterize the strength of the communication between them. We discussed different network architectures, how a neural network can have many layers, and why inner (hidden) layers are important. We explained how information flows from the input to the output, passing from each layer to the next based on the weights and the activation function. Finally, we showed how to train neural networks, that is, how to adjust their weights using gradient descent and backpropagation.
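As a brief recap of those ideas in code, the following is a minimal sketch (not the chapter's own example) of a one-hidden-layer network trained on the XOR problem with plain gradient descent and manually derived backpropagation; the layer sizes, learning rate, and sigmoid activation are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR inputs and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Weights of the connections between layers (2 inputs -> 4 hidden units -> 1 output)
W1 = rng.normal(scale=1.0, size=(2, 4))
b1 = np.zeros((1, 4))
W2 = rng.normal(scale=1.0, size=(4, 1))
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

learning_rate = 0.5  # illustrative choice
for step in range(10000):
    # Forward pass: information flows layer by layer through the weights
    # and the activation function
    h = sigmoid(X @ W1 + b1)       # hidden layer
    out = sigmoid(h @ W2 + b2)     # output layer

    # Mean squared error loss
    loss = np.mean((out - y) ** 2)

    # Backpropagation: apply the chain rule to propagate the error backwards
    d_out = 2 * (out - y) / len(X) * out * (1 - out)
    dW2 = h.T @ d_out
    db2 = d_out.sum(axis=0, keepdims=True)
    d_h = (d_out @ W2.T) * h * (1 - h)
    dW1 = X.T @ d_h
    db1 = d_h.sum(axis=0, keepdims=True)

    # Gradient descent: adjust the weights in the opposite direction
    # of the gradient
    W1 -= learning_rate * dW1
    b1 -= learning_rate * db1
    W2 -= learning_rate * dW2
    b2 -= learning_rate * db2

print("final loss:", loss)
print("predictions:", out.round(3).ravel())
```

Running the sketch shows the loss decreasing over the training steps as the weight updates gradually fit the XOR targets, which is the same forward pass, backpropagation, and gradient descent loop described above, only on a toy scale.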
In the next chapter, we'll continue discussing deep neural networks, and we'll explain in particular the meaning of deep in deep learning...