Simple Recurrent Neural Network
Here is what a simple neural network with loops looks like:

[Figure: RNN network]
In this diagram, a neural network $N$ takes input $x_t$ to produce output $h_t$. Due to the loop, at the next time step $t+1$, it takes the input $x_{t+1}$ along with the previous output $h_t$ to produce output $h_{t+1}$. Mathematically, we represent this as the following equation:

$$h_t = f(h_{t-1}, x_t)$$

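This recurrence — the same step function $f$ applied to the previous state and the current input — can be sketched in a few lines of Python (the step function here is generic; the concrete RNN equations come later):

```python
def run_rnn(f, xs, h0):
    """Apply the recurrence h_t = f(h_{t-1}, x_t) over a sequence of inputs."""
    h = h0
    hs = []
    for x in xs:
        h = f(h, x)  # the same function f is reused at every time step
        hs.append(h)
    return hs

# Toy example: a "running sum" step function
hs = run_rnn(lambda h, x: h + x, [1, 2, 3], 0)
# hs == [1, 3, 6]
```

Each call to `f` sees only the previous state and the current input, which is exactly what the loop in the diagram expresses.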
When we unroll the loop, the RNN architecture looks as follows at time step $t_1$:

[Figure: Unrolled RNN at time step t1]
As the time steps evolve, this loop unrolls as follows at time step $t_5$:

[Figure: Unrolled RNN at time step t5]
At every time step, the same learning function, $f$, and the same parameters, $w$ and $b$, are used.
The output $y$ is not always produced at every time step. Instead, a hidden state $h_t$ is produced at every time step, and another activation function (commonly softmax) is applied to $h_t$ to produce the output $y_t$. The equations for the RNN now look like this:

$$h_t = \tanh(w_x x_t + w_h h_{t-1} + b)$$
$$y_t = \mathrm{softmax}(w_y h_t)$$

where
- $w_x$ is the weight vector for the $x$ inputs that are connected to the hidden layer
- $w_h$ is the weight vector for the value of $h$ from the previous time step
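Putting these equations together, here is a minimal NumPy sketch of the forward pass, assuming tanh for the hidden state and softmax for the output; the weight names `w_x`, `w_h`, `w_y` and the toy dimensions are illustrative, not taken from any particular library:

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(z - z.max())
    return e / e.sum()

def rnn_forward(xs, w_x, w_h, w_y, b, h0):
    """Run the RNN equations over a sequence:
         h_t = tanh(w_x @ x_t + w_h @ h_{t-1} + b)
         y_t = softmax(w_y @ h_t)
    The same weights are reused at every time step."""
    h = h0
    ys = []
    for x in xs:
        h = np.tanh(w_x @ x + w_h @ h + b)
        ys.append(softmax(w_y @ h))
    return ys, h

# Toy dimensions: 2-d inputs, 3-d hidden state, 2 output classes
rng = np.random.default_rng(0)
xs = [rng.standard_normal(2) for _ in range(5)]
w_x = rng.standard_normal((3, 2))
w_h = rng.standard_normal((3, 3))
w_y = rng.standard_normal((2, 3))
ys, h = rnn_forward(xs, w_x, w_h, w_y, np.zeros(3), np.zeros(3))
# Each y_t is a probability distribution over the 2 output classes
```

Note that only `h` is carried across time steps; the weights never change within the sequence, which is the parameter sharing described above.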