Recurrent Neural Networks
Recurrent Neural Networks (RNNs) are a special family of neural networks designed to cope with sequential data (that is, time-series data), such as stock market prices or text (for example, variable-length sentences). RNNs maintain a state variable that captures the patterns present in sequential data as it is processed, and it is this state that allows them to model sequences.

Conventional feed-forward neural networks, by contrast, lack this ability unless the data is first converted into a feature representation that captures the important patterns in the sequence, and coming up with such feature representations is extremely difficult. Another way for a feed-forward model to handle sequential data is to keep a separate set of parameters for each position in the time/sequence, so that the parameters assigned to a given position learn the patterns that occur at that position. This will greatly increase the memory requirement...
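To make the idea of a shared state variable concrete, here is a minimal sketch (in NumPy, not taken from any particular library) of a single recurrent cell: the same weights are reused at every time step, and the hidden state h carries information forward through the sequence. The names W_xh, W_hh, and b_h are illustrative, not part of any standard API.

```python
import numpy as np

def rnn_forward(inputs, hidden_size, seed=0):
    """Run a single-layer recurrent cell over a sequence of input vectors.

    inputs: array of shape (time_steps, input_size)
    Returns the hidden state after the final time step.
    """
    rng = np.random.default_rng(seed)
    input_size = inputs.shape[1]

    # One shared set of parameters is reused at every position in the sequence,
    # unlike the per-position parameters a feed-forward model would need.
    W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))   # input -> hidden
    W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden -> hidden
    b_h = np.zeros(hidden_size)

    h = np.zeros(hidden_size)  # the state variable, carried across time steps
    for x_t in inputs:
        # h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h)
        h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)
    return h

# Example: a toy sequence of 5 time steps, each a 3-dimensional vector
sequence = np.random.default_rng(1).normal(size=(5, 3))
final_state = rnn_forward(sequence, hidden_size=4)
print(final_state.shape)  # (4,)
```

Because the parameters are shared across time steps, the number of parameters does not grow with the length of the sequence, which is exactly the memory problem that the per-position alternative runs into.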