Modeling Neural Networks
This chapter introduces the world of deep learning, whose methods achieve state-of-the-art performance on many classification and regression tasks that are often considered extremely difficult to tackle (such as image segmentation, machine translation, and speech synthesis). The goal is to give the reader the basic tools to understand the structure of a fully connected neural network built with Keras (employing modern techniques to speed up the training process and prevent overfitting).
In particular, the topics covered in the chapter are as follows:
- The structure of a basic artificial neuron
- Perceptrons, linear classifiers, and their limitations
- Multilayer perceptrons with the most important activation functions (such as ReLU)
- Back-propagation algorithms based on the stochastic gradient descent (SGD) optimization method
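As a preview of where the chapter is headed, the following minimal sketch shows the kind of fully connected network we will build with Keras, trained by back-propagation with the SGD optimizer. It is only an illustrative assumption: the layer sizes, learning rate, and synthetic dataset are placeholders, not the chapter's actual example.

```python
# Minimal illustrative sketch: a small fully connected (dense) network in Keras,
# trained with SGD on synthetic data. All sizes and hyperparameters are
# placeholder assumptions, not values taken from the chapter.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Synthetic binary-classification data: 1000 samples, 20 features
rng = np.random.default_rng(seed=0)
X = rng.normal(size=(1000, 20)).astype("float32")
y = (X[:, 0] + X[:, 1] > 0).astype("float32")

# Fully connected network with ReLU hidden layers
model = keras.Sequential([
    layers.Input(shape=(20,)),
    layers.Dense(32, activation="relu"),
    layers.Dense(16, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])

# Back-propagation driven by the stochastic gradient descent optimizer
model.compile(
    optimizer=keras.optimizers.SGD(learning_rate=0.05),
    loss="binary_crossentropy",
    metrics=["accuracy"],
)

model.fit(X, y, epochs=5, batch_size=32, verbose=0)
print(model.evaluate(X, y, verbose=0))
```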
Let's start this exploration with a formal definition...