Understanding TensorFlow Serving with Docker
At the core of TFS is the TensorFlow model server, which serves a model saved in Protobuf format. Installing the model server directly is not straightforward, as it has many dependencies. As a convenience, the TensorFlow team also provides the model server as a Docker container. Docker is a platform that uses operating-system-level virtualization to run software in self-contained containers, each bundled with all the necessary dependencies (that is, libraries or modules) to run in an isolated environment.
Therefore, the easiest way to deploy a TensorFlow SavedModel is by means of TFS running in a Docker container. To install Docker, refer to the Docker site (https://docs.docker.com/install/), which provides instructions for Mac, Windows, and Linux installations. For this chapter, the Community edition will suffice. We will be using Docker Desktop 2.4 running on macOS Catalina 10.15.6, with the specifications shown in Figure 9.1:
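Once Docker is installed, pulling the TFS image and serving a SavedModel takes only a couple of commands. The sketch below is illustrative: the directory `./my_model` and the model name `my_model` are placeholder names, not paths from this chapter, and it assumes the model was exported in the standard versioned layout (for example, `./my_model/1/saved_model.pb`).

```shell
# Pull the official TensorFlow Serving image from Docker Hub
docker pull tensorflow/serving

# Serve a SavedModel from the host directory ./my_model.
# Port 8501 exposes the REST API; MODEL_NAME sets the endpoint name.
# (./my_model and my_model are hypothetical placeholders.)
docker run -p 8501:8501 \
  --mount type=bind,source="$(pwd)/my_model",target=/models/my_model \
  -e MODEL_NAME=my_model \
  -t tensorflow/serving
```

With the container running, the model's REST endpoint is available at http://localhost:8501/v1/models/my_model, and predictions can be requested by POSTing JSON to its `:predict` endpoint.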