Introducing popular neural network architectures
In this chapter, we explore some of the most popular ANN architectures beyond the basic single-layer and multilayer perceptrons. The recurrent neural network (RNN) extends the feed-forward network with feedback connections that capture temporal relationships, and it is widely used in applications ranging from sentence autocompletion to stock market prediction. However, RNNs suffer from the vanishing gradient problem, which hinders their ability to learn long-range dependencies. Competitive networks, such as Kohonen networks and self-organizing maps, classify inputs without supervision, while Hopfield networks, special ANNs in which every node is connected to every other node, act as associative memories that converge to the stored pattern most similar to the input. Boltzmann machines (BMs) are stochastic variants of the Hopfield network, and restricted Boltzmann machines (RBMs) impose additional connectivity restrictions on them; both are trained using unsupervised learning approaches, making them great at extracting discriminative features from...