Computational language models based on RNNs are nowadays among the most successful techniques for statistical language modeling. They can be easily applied to a wide range of tasks, including automatic speech recognition and machine translation.
In this section, we'll apply an RNN model to a challenging language-processing task: guessing the next word in a sequence of text.
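Before diving into the TensorFlow code, it helps to see how next-word prediction is framed as a supervised problem: the inputs are the tokens of a sequence, and the targets are the same tokens shifted by one position. The following is a minimal plain-Python sketch of that framing (the function name is ours, for illustration only):

```python
def make_next_word_pairs(tokens):
    """Pair each token with the token that follows it.

    The model sees the input token and must predict the target
    token, which is simply the sequence shifted by one position.
    """
    return [(tokens[i], tokens[i + 1]) for i in range(len(tokens) - 1)]

pairs = make_next_word_pairs(["the", "cat", "sat", "on", "the", "mat"])
# The model sees "the" and must predict "cat", sees "cat" and
# must predict "sat", and so on through the sequence.
```

The RNN in the tutorial does exactly this over the PTB corpus, but with words mapped to integer IDs and the pairing done in batches.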
You'll find a complete reference for this example at the following page:
https://www.tensorflow.org/versions/r0.8/tutorials/recurrent/index.html.
You can download the source code for this example from the official TensorFlow project GitHub page:
https://github.com/tensorflow/models/tree/master/tutorials/rnn/ptb.
The files to download are as follows:
- ptb_word_lm.py: This file contains code to train the model on the PTB dataset
- reader.py: This file contains code to read the dataset
Here we present only the main...