Building an RNN from scratch in Python
In this recipe, we will build an RNN from scratch using a toy example, so that you gain a solid intuition of how an RNN helps solve the problem of taking the order of events (words) into consideration.
Getting ready
Note that a typical NN has an input layer, followed by an activation in the hidden layer, and then a softmax activation at the output layer.
An RNN follows a similar structure, modified so that the hidden state from the previous time step is fed into the hidden layer at the current time step.
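The recurrence just described can be sketched in a few lines of NumPy. This is only an illustrative sketch, not the recipe's actual network: the dimensions, weight scales, and the choice of tanh and softmax here are assumptions made for the example.

```python
import numpy as np

# Illustrative sketch of an RNN forward pass; all sizes are arbitrary assumptions
np.random.seed(0)
input_dim, hidden_dim, output_dim = 3, 4, 2

Wxh = np.random.randn(hidden_dim, input_dim) * 0.1   # input -> hidden weights
Whh = np.random.randn(hidden_dim, hidden_dim) * 0.1  # previous hidden -> hidden weights
Why = np.random.randn(output_dim, hidden_dim) * 0.1  # hidden -> output weights

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def rnn_forward(inputs):
    h = np.zeros(hidden_dim)  # hidden state starts at zero
    for x in inputs:
        # the current hidden state depends on the current input AND the previous hidden state
        h = np.tanh(Wxh @ x + Whh @ h)
    return softmax(Why @ h)   # softmax output after the final time step

sequence = [np.random.randn(input_dim) for _ in range(5)]
probs = rnn_forward(sequence)
print(probs)  # a probability distribution over the output classes
```

The line `h = np.tanh(Wxh @ x + Whh @ h)` is the key difference from a plain feed-forward network: because `h` appears on both sides, earlier inputs in the sequence influence later hidden states, which is how the order of events is taken into account.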
We'll work through the details of an RNN with a simple example before applying it to more practical use cases.
Let's consider an example text that looks as...