In recent years, recurrent neural network models have produced remarkable results, visible in real-life applications such as machine translation and speech synthesis. One striking application of GRUs is text generation: today's state-of-the-art models achieve results that, a decade ago, were just a dream. If you want to truly appreciate these results, I strongly recommend you read Andrej Karpathy's article The Unreasonable Effectiveness of Recurrent Neural Networks (http://karpathy.github.io/2015/05/21/rnn-effectiveness/).
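To make the text-generation use case concrete before any theory, here is a minimal sketch of how a GRU layer can sit at the core of a character-level language model. It uses PyTorch; the class name, layer sizes, and sampling loop are illustrative assumptions rather than anything prescribed by this chapter.

```python
import torch
import torch.nn as nn

class CharGRU(nn.Module):
    """A character-level language model built around a single GRU layer (illustrative)."""

    def __init__(self, vocab_size: int, hidden_size: int = 128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)    # character id -> vector
        self.gru = nn.GRU(hidden_size, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, vocab_size)        # hidden state -> next-char logits

    def forward(self, tokens, hidden=None):
        x = self.embed(tokens)              # (batch, seq_len, hidden_size)
        out, hidden = self.gru(x, hidden)   # the GRU carries state across the sequence
        return self.head(out), hidden

@torch.no_grad()
def generate(model, start_id, num_chars):
    """Sample characters one at a time, feeding each output back in as the next input."""
    token = torch.tensor([[start_id]])      # shape (batch=1, seq_len=1)
    hidden, out = None, []
    for _ in range(num_chars):
        logits, hidden = model(token, hidden)
        probs = torch.softmax(logits[0, -1], dim=-1)
        token = torch.multinomial(probs, 1).unsqueeze(0)
        out.append(token.item())
    return out

# Untrained toy run over a hypothetical 50-symbol vocabulary; real text
# generation requires training the model on a corpus first.
model = CharGRU(vocab_size=50)
print(generate(model, start_id=0, num_chars=20))
```

The key design point is the recurrent hidden state: it is produced while reading one character and passed back in for the next, which is what lets the model condition each prediction on everything generated so far.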
Having said that, we can now introduce the Gated Recurrent Unit (GRU), one of the models behind these exceptional results. A closely related and slightly more complex model is the Long Short-Term Memory (LSTM). Both architectures aim to...