Input handling for loading data
Many common examples focus on the modeling aspect, such as how to build a deep learning model in TensorFlow with various layers and patterns. In these examples, the data is almost always loaded directly into runtime memory. This is fine as long as the training data is small enough to fit, but what if it is larger than your runtime memory can handle? The solution is data streaming. We have used this technique to feed data into our models in previous chapters; here we take a closer look at data streaming and generalize it to more data types.
The streaming technique works much like a Python generator: data is ingested into the model training process in batches, so the entire dataset is never sent at once. In this chapter we use flower image data as our example. Although this dataset is not big by any means, it is convenient for teaching and learning...
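To make the generator analogy concrete, here is a minimal sketch of batched streaming using plain Python. The function name `batch_generator`, the CSV format, and the batch size are illustrative assumptions, not part of the flower-image example itself; the point is simply that only one batch of records is held in memory at a time, no matter how large the file is.

```python
import csv
import os
import tempfile

def batch_generator(path, batch_size=2):
    """Yield rows from a CSV file in fixed-size batches,
    so the whole file never has to fit in memory at once.
    (Illustrative helper; name and signature are assumptions.)"""
    batch = []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            batch.append(row)
            if len(batch) == batch_size:
                yield batch  # hand one batch to the caller
                batch = []
    if batch:
        yield batch  # final partial batch, if any

# Usage: write a tiny sample file, then stream it back in batches.
with tempfile.NamedTemporaryFile(
    "w", suffix=".csv", delete=False, newline=""
) as f:
    f.write("1,a\n2,b\n3,c\n4,d\n5,e\n")
    path = f.name

batches = list(batch_generator(path, batch_size=2))
os.remove(path)
print(batches)
# → [[['1', 'a'], ['2', 'b']], [['3', 'c'], ['4', 'd']], [['5', 'e']]]
```

The same idea underlies TensorFlow's input pipelines: a generator like this can be wrapped with `tf.data.Dataset.from_generator` so the training loop pulls batches on demand instead of loading everything up front.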