Summary
In this chapter, you have learned how to launch the JupyterLab environment to run TensorFlow Enterprise. TensorFlow Enterprise is available in three different forms: an AI Platform Notebooks instance, a Deep Learning VM (DLVM), and a Docker container. The compute resources these methods consume are visible in the Google Cloud Compute Engine panel. These compute nodes do not shut down on their own, so it is important to stop or delete them once you are done using them.
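For example, a running notebook instance can be stopped (or deleted) from a terminal with the gcloud CLI; the instance name and zone below are placeholders you would replace with your own values:

    gcloud compute instances list
    gcloud compute instances stop my-notebook-instance --zone=us-central1-a
    gcloud compute instances delete my-notebook-instance --zone=us-central1-a

Stopping an instance preserves its disks so it can be restarted later, while deleting it removes the instance entirely.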
The BigQuery command-line tool is seamlessly integrated with the TensorFlow Enterprise environment. Parameterizing a SQL query string makes it quick and easy to create a derived dataset and perform feature selection.
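As a minimal sketch of such a parameterized extraction (the public dataset, columns, and parameter value here are illustrative, not the ones used in the chapter):

    bq query --use_legacy_sql=false \
      --parameter='min_year:INT64:2015' \
      'SELECT name, SUM(number) AS total
       FROM `bigquery-public-data.usa_names.usa_1910_current`
       WHERE year >= @min_year
       GROUP BY name
       ORDER BY total DESC
       LIMIT 10'

The --parameter flag binds @min_year inside the query, so the same query string can be reused to extract different slices of the data.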
TensorFlow Enterprise works even when your data is not yet in Google Cloud Storage. By pulling and running the TensorFlow Enterprise Docker container, you can use it with on-premises or local data sources.
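A minimal sketch of that workflow, assuming a Deep Learning Containers image name and a local data path that you would replace with your own (check the Google Cloud documentation for the current TensorFlow Enterprise image tags):

    docker pull gcr.io/deeplearning-platform-release/tf2-cpu.2-3
    docker run -d -p 8080:8080 \
      -v /path/to/local/data:/home/jupyter/data \
      gcr.io/deeplearning-platform-release/tf2-cpu.2-3

The container serves JupyterLab on port 8080, and the -v flag mounts the local data directory so notebooks running inside the container can read it directly.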
Now that you have seen how to leverage data availability and accessibility for TensorFlow...