Summary
In this chapter, we discussed what generative modeling is and how it fits into the landscape of more familiar machine learning methods. We used probability theory and Bayes' theorem to describe how these models approach prediction in a manner opposite to that of discriminative learning.
We reviewed use cases for generative learning, both for specific kinds of data and for general prediction tasks. Finally, we examined some of the specialized challenges that arise when building these models.
In the next chapter, we will begin our practical implementation of these models by exploring how to set up a development environment for TensorFlow 2.0 using Docker and Kubeflow.