Encoding and tensors
Preparing time series data for deep learning models already differs a bit from more classic applications such as classification or regression: we reshape the data by lagging the input series to create the input tensor. This pattern continues in our application of the base LSTM network. We'll also talk briefly about how the cell state, s(t), gets initialized.
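To make the lagging step concrete, here is a minimal NumPy sketch of the idea, independent of the tooling used in this chapter. The function name make_lagged_tensor and the parameter n_lags are illustrative choices, not names taken from the chapter's workflow; the point is only the resulting tensor shape of (samples, lags, 1) that a univariate LSTM expects.

    import numpy as np

    def make_lagged_tensor(series, n_lags):
        # Slide a window of n_lags past values over a 1-D series.
        # Returns X with shape (samples, n_lags, 1) and y with shape (samples,).
        series = np.asarray(series, dtype=np.float32)
        X, y = [], []
        for t in range(n_lags, len(series)):
            X.append(series[t - n_lags:t])   # the lagged input values
            y.append(series[t])              # the value to forecast
        X = np.array(X)[..., np.newaxis]     # add a trailing feature axis of size 1
        return X, np.array(y)

    # Toy example: 10 observations, 3 lags
    X, y = make_lagged_tensor(np.arange(10.0), n_lags=3)
    print(X.shape, y.shape)   # (7, 3, 1) (7,)

Each row of X holds the previous n_lags values of the series, and the matching entry of y is the value to be predicted, which is exactly the reshaping we perform before feeding data to the LSTM.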
We'll do this by recapping the shape of our input data, then introducing the nodes required for reshaping our data table, and finally producing the tensor that we feed into the LSTM model.
Input data
For this chapter and this use case, we'll assume we have a single-variable (univariate) time series to forecast: a single column of values of the same type, recorded over time. We'll also assume the data has been cleaned properly, as detailed in Chapter 3, Preparing Data for Time Series Analysis.
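For illustration only, a cleaned univariate series might look like the following pandas sketch; the column name sales, the dates, and the values are made up, and the only requirements assumed here are a single numeric column and a regular time index.

    import pandas as pd

    # A single value column indexed by timestamps at a regular (here monthly) frequency
    idx = pd.date_range("2020-01-01", periods=6, freq="MS")
    sales = pd.Series([112.0, 118.0, 132.0, 129.0, 121.0, 135.0],
                      index=idx, name="sales")
    print(sales)

This is the shape of data the rest of the chapter starts from: one value per time step, with no gaps or mixed types.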
Note
...