The most basic form of an ANN is known as a feedforward network, sometimes called a multi-layer perceptron. These models, while simple, contain the core building blocks for the various types of ANNs that we will examine going forward.
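Before looking at the structure in more detail, a minimal sketch may help fix ideas. The snippet below (a hedged illustration, not code from this book; the layer sizes, random weights, and sigmoid activation are all assumptions made purely for demonstration) pushes an input vector through one hidden layer and one output layer, which is exactly the kind of small multi-layer perceptron described here:

```python
import numpy as np

def sigmoid(z):
    # A common choice of activation; assumed here for illustration only.
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# Illustrative sizes: 3 inputs -> 4 hidden neurons -> 2 outputs.
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)   # input layer  -> hidden layer
W2, b2 = rng.normal(size=(2, 4)), np.zeros(2)   # hidden layer -> output layer

def forward(x):
    """Information flows strictly forward: input -> hidden -> output."""
    h = sigmoid(W1 @ x + b1)   # hidden-layer activations
    y = sigmoid(W2 @ h + b2)   # output-layer activations
    return y

print(forward(np.array([0.5, -1.2, 0.3])))
```

Note that nothing in `forward` ever feeds an activation back to an earlier layer; that forward-only flow is the defining property discussed next.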
In essence, a feedforward neural network is nothing more than a directed acyclic graph; there are no loops or recurrent connections between the layers, and information simply flows forward through the graph. Traditionally, when these networks are illustrated, you'll see them represented as in the following diagram:
In this simplest form, ANNs are typically organized into three layers: an input layer, a hidden layer, and an output layer, each made up of many simple input/output processing units commonly referred to as neurons. While it's helpful to view networks as basic...