Using TensorFlow Serving to serve models
In this section, we will use TensorFlow Serving to serve models. First, we will use the recommended mechanism of using TensorFlow with Docker. The official page presenting this example is https://www.tensorflow.org/tfx/serving/docker.
TensorFlow Serving with Docker
Make sure Docker is installed and running on your platform; the link provided in the Technical requirements section covers installation. Now, let's work through the following steps to serve our dummy example model:
- First of all, start Docker and make sure it is running. You can verify that Docker is running by executing the docker --version command in your operating system's terminal. It should give you an output similar to the following:
$ docker --version
Docker version 20.10.11, build dea9396
- Now, let's pull the latest TensorFlow Serving Docker image using the docker pull tensorflow/serving command. You should see the following...
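Once the image has been pulled, the container can be started so that it loads a SavedModel and exposes it over the REST API. The following is a minimal sketch based on the commands documented on the official TensorFlow Serving Docker page; the source path /path/to/my_model and the model name my_model are placeholders standing in for our dummy example model:

```shell
# Pull the latest TensorFlow Serving image (CPU build)
docker pull tensorflow/serving

# Start the container: bind-mount the local SavedModel directory into
# the container's model path and expose port 8501 (the REST API port).
# /path/to/my_model and my_model are placeholders for the dummy model.
docker run -p 8501:8501 \
  --mount type=bind,source=/path/to/my_model,target=/models/my_model \
  -e MODEL_NAME=my_model -t tensorflow/serving
```

With the container running, the model's metadata and predictions are available at http://localhost:8501/v1/models/my_model.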