Summary
In this chapter, you explored recurrent models for sequential data. You learned that in sequential data, such as natural language text, each data point depends on the points that precede it. You also learned why such data calls for models that consume the sequence in order and generate the next output step by step.
This chapter introduced RNN models that make predictions on sequential data. You observed how RNNs loop back on themselves, feeding each step's output back in as part of the next step's input. You also reviewed the challenges these models face, such as vanishing and exploding gradients, and how to address them.
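The recurrence and one common remedy for exploding gradients can be sketched in a few lines of plain Python. This is an illustrative scalar toy, not the chapter's TensorFlow code; all names here (`rnn_step`, `rnn_forward`, `clip_gradient`) are made up for the sketch:

```python
import math

# A single RNN cell unrolled over a sequence: the hidden state h carries
# information forward, so each new output depends on all earlier inputs.
def rnn_step(x, h, w_x, w_h, b):
    # new state = tanh(w_x * x + w_h * h + b), scalar weights for clarity
    return math.tanh(w_x * x + w_h * h + b)

def rnn_forward(xs, w_x=0.5, w_h=0.9, b=0.0):
    h = 0.0  # initial hidden state
    states = []
    for x in xs:
        h = rnn_step(x, h, w_x, w_h, b)  # output loops back in as input
        states.append(h)
    return states

# Backpropagating through many steps multiplies a factor like w_h * (1 - h^2)
# at each step, so gradients can shrink (vanish) or grow (explode) geometrically.
# Clipping caps the gradient's magnitude to address the exploding case.
def clip_gradient(g, max_abs=1.0):
    return max(-max_abs, min(max_abs, g))

states = rnn_forward([1.0, 0.5, -0.2, 0.0])
print(len(states))          # one hidden state per input: 4
print(clip_gradient(37.5))  # large gradient clipped to 1.0
```

In practice you would not clip by hand; Keras optimizers, for example, accept a clipping setting so that gradients are rescaled automatically during training.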
In the next chapter, you will learn how to build custom TensorFlow components, including loss functions and layers, to use within your models.