Sequence padding
In this recipe, we will learn how Keras can be used for sequence padding. Padding is useful when sequences are fed to the LSTM network in batches, because all sequences in a batch must have the same length.
Getting ready
Import the function:
from keras.preprocessing.sequence import pad_sequences
pad_sequences is a function with the following signature:
pad_sequences(sequences, maxlen=None, dtype='int32', padding='pre', truncating='pre', value=0.0)
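As a quick sketch of the remaining parameters (not covered in the examples below): maxlen caps every sequence at a fixed length, value changes the fill value, and the default truncating='pre' drops timesteps from the front of sequences that exceed maxlen. For example:
from keras.preprocessing.sequence import pad_sequences

# cap each sequence at 3 timesteps; pad shorter ones with -1 instead of 0
example = pad_sequences([[1, 2, 3, 4], [5]], maxlen=3, value=-1)
print(example)
# expected output:
# [[ 2  3  4]
#  [-1 -1  5]]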
How to do it...
Let's look at the various padding options.
Pre-padding with default 0.0 padding
First, let's look at how to use pad_sequences with the default pre-padding:
from keras.preprocessing.sequence import pad_sequences

# define sequences
sequences = [
    [1, 2, 3, 4],
    [5, 6, 7],
    [5]
]

# pad sequences
padded = pad_sequences(sequences)
print(padded)
The output of the preceding print statement shows all the sequences pre-padded with zeros to a length of 4.
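With the default padding='pre', the zeros go at the front of the shorter sequences, so the print statement should produce:
[[1 2 3 4]
 [0 5 6 7]
 [0 0 0 5]]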
Post-padding
To pad zeros at the end of the shorter sequences, use padding='post', as shown in the following code snippet:
padded_post = pad_sequences(sequences, padding='post')
print(padded_post)
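With post-padding, the zeros are appended instead, so printing padded_post should produce:
[[1 2 3 4]
 [5 6 7 0]
 [5 0 0 0]]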