Summary
In this chapter, we began by comparing artificial neurons (in the form of the perceptron) with biological neurons in the human brain. We then extended this idea to the activity of multiple neurons in an NN, both in terms of combining multiple perceptrons and in terms of how the neurons in our brain work together to produce extremely complex higher-level functions.
We then dove deeper into the inner workings and components of ANNs, including concepts such as activation functions and backpropagation. We discussed many different activation functions, how each of them works, and the use cases for which each is most appropriate.
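As a quick refresher, the following is a minimal sketch of three activation functions of the kind discussed in the chapter, written in plain NumPy. The function names here are illustrative and are not tied to any particular library's API:

```python
import numpy as np

def sigmoid(x):
    # Squashes any real input into (0, 1); historically common
    # for binary classification outputs.
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Rectified Linear Unit: zero for negative inputs, identity
    # otherwise. A common default for hidden layers.
    return np.maximum(0.0, x)

def tanh(x):
    # Squashes input into (-1, 1); zero-centered, unlike sigmoid.
    return np.tanh(x)
```

For example, `sigmoid(0.0)` returns exactly 0.5, and `relu` passes positive values through unchanged while clipping negative values to zero.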
In the context of backpropagation, we learned about several commonly used cost-function optimization algorithms, such as Momentum and Adam, and then we introduced two very important libraries for DL: TensorFlow and Keras.
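To recall the intuition behind Momentum, here is a small sketch of the classical momentum update applied to a simple one-dimensional quadratic loss, f(w) = w². The hyperparameter names `lr` and `beta` are illustrative choices for this sketch, not taken from any specific library:

```python
def momentum_step(w, v, grad, lr=0.1, beta=0.9):
    # Keep a decaying "velocity" that accumulates past gradients,
    # then move the weight along that velocity. This smooths out
    # oscillations compared with plain gradient descent.
    v = beta * v - lr * grad
    return w + v, v

w, v = 5.0, 0.0
for _ in range(200):
    grad = 2.0 * w          # gradient of f(w) = w**2
    w, v = momentum_step(w, v, grad)
# After enough steps, w approaches the minimum at 0
```

Adam builds on the same velocity idea but additionally scales each step by a running estimate of the gradient's magnitude.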
Next, we built our first NN using those libraries, and we tested...