When we think about how a human thinks, we don't observe a situation just once; we constantly update what we're thinking based on the context that has come before. Think about reading a book: the meaning of each chapter builds on the words and chapters that preceded it. Vanilla feedforward networks accept only fixed-size inputs and treat each input independently of the ones before it, so they struggle to model sequential data such as natural language. RNNs help us address this.
The building blocks of RNNs
Basic structure
RNNs differ from other networks in that they have a recurrent structure: they recur over time. RNNs use feedback loops, which allow information to persist within the network. We can think of an RNN as multiple copies of the same network, each passing a message forward to its successor.
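To make this recurrence concrete, the following minimal sketch in NumPy applies the same cell to every element of a toy sequence, carrying a hidden state forward from one step to the next. The names here (rnn_step, W_xh, W_hh, b_h) are illustrative assumptions for this sketch, not identifiers from the text.

import numpy as np

# A minimal sketch of a simple (Elman-style) recurrent step.
input_size, hidden_size = 10, 16
rng = np.random.default_rng(0)
W_xh = rng.standard_normal((hidden_size, input_size)) * 0.01   # input-to-hidden weights
W_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.01  # hidden-to-hidden (recurrent) weights
b_h = np.zeros(hidden_size)                                    # hidden bias

def rnn_step(x_t, h_prev):
    """Update the hidden state from the current input and the previous hidden state."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

# Unrolling the loop: the same weights are reused at every time step,
# so the network behaves like multiple copies of itself passing a message forward.
h = np.zeros(hidden_size)
sequence = rng.standard_normal((5, input_size))  # a toy sequence of 5 time steps
for x_t in sequence:
    h = rnn_step(x_t, h)

Because the same weights are shared across all time steps, the hidden state h acts as the "message" that persists within the network as the sequence is processed.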