Chapter 6: Recurrent Neural Networks for Demand Prediction
By now we have gathered some experience with fully connected feedforward neural networks in two variants: performing a classification task by assigning an input sample to one of a set of predefined classes, or reproducing the shape of an input vector via an autoencoder architecture. In both cases, the output response depends only on the values of the current input vector. At time t, the output response y(t) depends on, and only on, the input vector x(t) at time t. The network has no memory of what came before and produces y(t) based only on input x(t).
With Recurrent Neural Networks (RNNs), we introduce the time component t. We are going to discover networks where the output response y(t) at time t depends on the current input sample x(t) as well as on previous input samples x(t-1), x(t-2), ..., where how far back the network's memory of past samples reaches depends on the network architecture.
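The distinction can be made concrete with a minimal sketch. The snippet below (an illustrative NumPy toy, not the book's implementation; the dimensions, random weights, and the simple tanh update h(t) = tanh(W_x x(t) + W_h h(t-1) + b) are assumptions for demonstration) shows that a feedforward layer always responds identically to the same input, while a recurrent layer responds differently depending on what came before.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: 3 input features, 4 hidden units.
n_in, n_hidden = 3, 4
W_x = rng.normal(size=(n_hidden, n_in))      # input-to-hidden weights
W_h = rng.normal(size=(n_hidden, n_hidden))  # hidden-to-hidden (recurrent) weights
b = np.zeros(n_hidden)

def feedforward_step(x):
    """Feedforward: the response depends only on the current input x(t)."""
    return np.tanh(W_x @ x + b)

def rnn_step(x, h_prev):
    """Recurrent: the response also depends on the previous state h(t-1),
    which summarizes all earlier inputs x(t-1), x(t-2), ..."""
    return np.tanh(W_x @ x + W_h @ h_prev + b)

def run(seq):
    """Feed a whole sequence through the recurrent cell, one step at a time."""
    h = np.zeros(n_hidden)  # initial state: no memory yet
    for x in seq:
        h = rnn_step(x, h)
    return h

# Two sequences that end with the SAME input vector but differ earlier.
x_t = np.ones(n_in)
seq_a = [np.zeros(n_in), x_t]
seq_b = [np.full(n_in, 5.0), x_t]

# The feedforward response to x_t is fixed, regardless of any history,
# while the recurrent responses differ: the network remembers the past.
print(np.allclose(run(seq_a), run(seq_b)))  # False
```

The recurrent state h acts as the network's memory: even though both sequences end with the identical input x(t), the hidden state carried over from the earlier, different inputs changes the final response.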
We will first introduce the general concept...