Need for RNN
Deep learning networks for natural language are numerical and deal well with multidimensional arrays of floats and integers as input values. For categorical values, such as characters or words, the previous chapter demonstrated a technique known as embedding that transforms them into numerical values as well.
So far, all inputs have been fixed-size arrays. In many applications, such as texts in natural language processing, an input carries one semantic meaning but can be represented by sequences of variable length.
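To make the mismatch concrete, here is a minimal sketch (the sentences and the hash-based token IDs are illustrative assumptions, not from the text): two inputs of the same kind produce token sequences of different lengths, so they cannot be stacked into one fixed-size array without extra work.

```python
# Hypothetical tokenized sentences: the same kind of meaning-bearing input,
# but with different lengths, so they do not fit one fixed-size array.
sentences = [
    "the cat sat",
    "the cat sat on the mat",
]

# Toy vocabulary lookup (hash-based IDs stand in for a real embedding index).
token_ids = [[hash(w) % 1000 for w in s.split()] for s in sentences]

print([len(seq) for seq in token_ids])  # → [3, 6]: variable-length sequences
```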
There is a need to deal with variable-length sequences as shown in the following diagram:
Recurrent Neural Networks (RNNs) are the answer to variable-length inputs.
Recurrence can be seen as applying a feedforward network more than once, at different time steps, to different incoming input data, but with one major difference: the presence of connections to the past (previous time steps), whose goal is to refine the representation of the input through time.
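The idea above can be sketched in NumPy: the same step function (a small feedforward transform) is reused at every time step, and a hidden state carries information from previous steps. The dimensions, weight initialization, and tanh activation are illustrative assumptions, not details from the text.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden = 4, 3

# Input-to-hidden weights, applied to the data at each time step.
W_in = rng.standard_normal((n_in, n_hidden)) * 0.1
# Hidden-to-hidden weights: the "connection to the past".
W_rec = rng.standard_normal((n_hidden, n_hidden)) * 0.1

def rnn_step(x_t, h_prev):
    """One time step: combine the current input with the previous hidden state."""
    return np.tanh(x_t @ W_in + h_prev @ W_rec)

def run(sequence):
    """Apply the same step at every time step, whatever the sequence length."""
    h = np.zeros(n_hidden)       # initial state: no past yet
    for x_t in sequence:
        h = rnn_step(x_t, h)     # the representation is refined through time
    return h                     # a fixed-size summary of the whole sequence

short_seq = rng.standard_normal((2, n_in))  # 2 time steps
long_seq = rng.standard_normal((7, n_in))   # 7 time steps
print(run(short_seq).shape, run(long_seq).shape)  # both (3,)
```

Note that both sequences, despite their different lengths, yield a hidden state of the same fixed size, which is what makes recurrence a natural fit for variable-length inputs.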
At each time step...