Adding Long Short-Term Memory (LSTM)
One limitation of a simple RNN is that it only takes into account the inputs immediately surrounding the current one. In many applications, and in language in particular, one also needs to understand the context of a larger part of the sentence. This is why LSTMs have played an important role in applying deep learning to unstructured data types such as text. An LSTM unit has an input, forget, and output gate, as shown in Figure 4.2:
Figure 4.2: Example flow in an LSTM unit
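For reference, the standard LSTM update equations are given below (this is the common textbook notation, not taken from the figure): $x_t$ is the input at time step $t$, $h_{t-1}$ the previous hidden state, $c_t$ the cell state, $\sigma$ the logistic sigmoid, and $\odot$ element-wise multiplication:

$$
\begin{aligned}
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) \\
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) \\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) \\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t \\
h_t &= o_t \odot \tanh(c_t)
\end{aligned}
$$

The forget gate $f_t$ decides what to discard from the cell state, the input gate $i_t$ decides what new information to store, and the output gate $o_t$ decides what part of the cell state to expose as the hidden state.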
In the following recipe, we will be classifying reviews from the IMDB dataset using the Keras framework.
How to do it...
- Let's start by importing the libraries we need:
import numpy as np
from keras.preprocessing import sequence
from keras.models import Sequential
from keras.layers import Dense, Dropout, Activation
from keras.layers import Embedding
from keras.layers import LSTM
from keras.datasets import imdb
- We will be using the IMDB dataset from Keras; load the data with the following code:
n_words = 1000  # keep only the 1,000 most frequent words
(X_train, y_train), (X_test, y_test) = imdb.load_data(num_words=n_words)
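From here, the recipe proceeds by padding the reviews to a fixed length and training a small LSTM classifier. A minimal sketch of those steps follows, using the layers imported above; the sequence length, layer sizes, and training settings are illustrative choices, not the book's exact values:

# Pad/truncate each review to a fixed length so batches have a uniform shape
max_len = 200  # illustrative choice
X_train = sequence.pad_sequences(X_train, maxlen=max_len)
X_test = sequence.pad_sequences(X_test, maxlen=max_len)

# Embed the word indices, run them through an LSTM, and predict the sentiment
model = Sequential()
model.add(Embedding(n_words, 50))   # 50-dimensional embeddings (illustrative)
model.add(Dropout(0.2))
model.add(LSTM(100))                # 100 LSTM units (illustrative)
model.add(Dense(1))
model.add(Activation('sigmoid'))    # binary output: positive/negative review
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])

model.fit(X_train, y_train, batch_size=64, epochs=2,
          validation_data=(X_test, y_test))

Because the final layer is a single sigmoid unit, binary cross-entropy is the natural loss for this two-class sentiment task.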