- Why is the MLP better than the perceptron model?
The additional neurons and layers give the MLP an advantage over the perceptron: it can model non-linear decision boundaries and therefore solve much more complicated pattern recognition problems, whereas the perceptron can only separate classes with a single hyperplane, as the sketch below illustrates.
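A minimal sketch of this difference, using scikit-learn (an assumed library, not necessarily the one used elsewhere in this book). XOR is the classic non-linearly separable problem: a single perceptron cannot fit it, while an MLP with one hidden layer can.

```python
from sklearn.linear_model import Perceptron
from sklearn.neural_network import MLPClassifier

# XOR: no single line can separate the two classes.
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 1, 1, 0]

# A perceptron (single linear unit) fails on XOR.
p = Perceptron().fit(X, y)
print("perceptron accuracy:", p.score(X, y))   # around 0.5

# An MLP with one small hidden layer can solve it.
mlp = MLPClassifier(hidden_layer_sizes=(4,), activation="tanh",
                    solver="lbfgs", random_state=0).fit(X, y)
print("MLP accuracy:", mlp.score(X, y))        # should be 1.0
```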
- Why is backpropagation so important to know about?
Because it is the algorithm that makes neural networks learn: it applies the chain rule to propagate the error from the output layer back through the network, yielding the gradient of the loss with respect to every weight so that gradient descent can update them. This is what made training networks practical in the era of big data; a small sketch follows.
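To make the mechanics concrete, here is a minimal NumPy sketch (my own illustrative implementation, not the book's code) of backpropagation for a tiny one-hidden-layer network with sigmoid activations and a mean-squared-error loss, trained on XOR:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Weights for a 2-4-1 network.
W1 = rng.normal(size=(2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1)); b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(5000):
    # Forward pass.
    a1 = sigmoid(X @ W1 + b1)
    a2 = sigmoid(a1 @ W2 + b2)
    # Backward pass: chain rule applied layer by layer.
    d2 = (a2 - y) * a2 * (1 - a2)        # dL/dz2 for MSE loss
    d1 = (d2 @ W2.T) * a1 * (1 - a1)     # dL/dz1, error pushed back
    # Gradient-descent updates.
    W2 -= lr * a1.T @ d2; b2 -= lr * d2.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d1;  b1 -= lr * d1.sum(axis=0, keepdims=True)

print(np.round(a2, 2))  # should approach [[0], [1], [1], [0]]
```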
- Does the MLP always converge?
Yes and no. With a suitable learning rate, it typically converges to a local minimum of the loss function; however, it is not guaranteed to reach the global minimum, since most loss functions are non-convex and non-smooth. The sketch below shows how the starting point alone can decide which minimum you end up in.
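A small illustration of this (a toy example of my own, not from the book): gradient descent on the non-convex function f(x) = x^4 - 3x^2 + x, which has a local minimum near x = 1 and a global minimum near x = -1.37. Depending on where we start, we land in a different minimum:

```python
def grad(x):
    # Derivative of f(x) = x**4 - 3*x**2 + x.
    return 4 * x**3 - 6 * x + 2

for x0 in (2.0, -2.0):
    x = x0
    for _ in range(1000):
        x -= 0.01 * grad(x)   # plain gradient descent
    print(f"start {x0:+.1f} -> converged to x = {x:+.3f}")
# start +2.0 -> converged to x = +1.000  (local minimum)
# start -2.0 -> converged to x = -1.366  (global minimum)
```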
- Why should we try to optimize the hyperparameters of our models?
Because anyone can train a simple neural network; however, not everyone knows which things to change to make it better, as the search sketch below suggests. The success of your model depends heavily on you trying different configurations and proving to yourself (and others) that your model is the best that it can be. This is what will make you a better...
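One systematic way to try different things is a grid search over hyperparameters. A minimal sketch using scikit-learn's GridSearchCV and MLPClassifier (the dataset and the grid values here are illustrative choices, not the book's):

```python
from sklearn.datasets import make_moons
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPClassifier

X, y = make_moons(n_samples=500, noise=0.2, random_state=0)

# Candidate hyperparameter values; these choices are arbitrary examples.
param_grid = {
    "hidden_layer_sizes": [(8,), (16,), (16, 8)],
    "activation": ["relu", "tanh"],
    "alpha": [1e-4, 1e-2],   # L2 regularization strength
}

# 5-fold cross-validation scores every combination in the grid.
search = GridSearchCV(
    MLPClassifier(max_iter=2000, random_state=0),
    param_grid, cv=5,
)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```

For larger grids, a randomized search usually finds comparable hyperparameters at a fraction of the cost of trying every combination.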