Introduction to neural networks
Artificial neural networks (ANNs, or simply "nets") are a class of machine learning models loosely inspired by studies of the central nervous system of mammals. Each ANN is made up of several interconnected "neurons," organized in "layers." Neurons in one layer pass messages to neurons in the next layer (they "fire," in the jargon), and this message passing is how the network performs its computation. Initial studies began in the late 1950s with the introduction of the "perceptron" [1], a two-layer network used for simple operations, and were further expanded in the late 1960s with the introduction of the "back-propagation" algorithm for efficient multi-layer network training (according to [2] and [3]). Some studies argue that these techniques have roots dating back further than is normally cited [4].
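To make the idea of a neuron that "fires" concrete, the following is a minimal sketch, not taken from the text, of a single perceptron-style neuron in Python with NumPy: it computes a weighted sum of its inputs plus a bias and emits 1 if that sum is positive, 0 otherwise. The weights and bias shown are illustrative values chosen so the neuron behaves like a logical AND of two binary inputs.

import numpy as np

def perceptron_fire(inputs: np.ndarray, weights: np.ndarray, bias: float) -> int:
    """Return 1 if the weighted sum of inputs plus bias is positive, else 0."""
    activation = np.dot(weights, inputs) + bias
    return 1 if activation > 0 else 0

# Illustrative example: a neuron wired to act as a logical AND of two inputs.
weights = np.array([1.0, 1.0])
bias = -1.5
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, perceptron_fire(np.array(x), weights, bias))

In a multi-layer network, many such neurons would be stacked in layers, with the outputs of one layer serving as the inputs of the next.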
Neural networks were a topic of intensive academic studies up until the...