We're going to talk about why nonlinearity matters, and then we'll look at some visualizations of the two most commonly used nonlinear functions: sigmoid and ReLU.
So, nonlinearity may sound like a complicated mathematical concept, but all you really need to know is that a nonlinear function doesn't follow a straight line. This is what allows neural networks to learn complex shapes, and it's this ability to fit complex shapes inside the structure of the network that lets neural networks and deep learning actually learn.
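To see why this matters, here's a minimal sketch (using NumPy, which the source doesn't show) of what happens without nonlinearity: stacking two purely linear layers collapses into a single linear layer, so extra depth buys you nothing.

```python
import numpy as np

# Two linear layers with no activation between them.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))  # first layer's weights
W2 = rng.normal(size=(2, 4))  # second layer's weights
x = rng.normal(size=(3,))     # an input vector

# Passing x through both layers in sequence...
y_stacked = W2 @ (W1 @ x)

# ...gives exactly the same result as one combined linear layer:
W_combined = W2 @ W1
y_single = W_combined @ x

print(np.allclose(y_stacked, y_single))  # True
```

A nonlinear function like sigmoid or ReLU between the layers breaks this collapse, which is what lets depth add real expressive power.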
So, let's take a look at the sigmoid function:
It's an S-shaped curve that ranges from zero to one, built from e raised to a negative exponent inside a ratio: sigmoid(x) = 1 / (1 + e^(-x)). Now, the good news is that you'll never actually have to code the math that you see here, because when we want to use sigmoid in Keras, we...
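Just to make the shape concrete, here's a small sketch of the sigmoid formula in plain NumPy (in practice, as noted above, you'd typically just pass `activation='sigmoid'` to a Keras layer rather than writing this yourself):

```python
import numpy as np

def sigmoid(x):
    # Squashes any real number into the open range (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(0.0))    # 0.5 -- the midpoint of the S-curve
print(sigmoid(10.0))   # very close to 1
print(sigmoid(-10.0))  # very close to 0
```

Large positive inputs saturate toward 1, large negative inputs toward 0, which is exactly the S-curve you see in the visualization.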