Working with TensorFlow Serving and Docker
In this recipe we will show how to run a Docker container for TensorFlow Serving, a set of components for exporting a trained TensorFlow model and serving it with the standard tensorflow_model_server. The TensorFlow Serving server discovers newly exported models and runs a gRPC service for serving them.
Getting ready
We will use Docker and will assume that you are familiar with it. If not, please have a look at https://www.docker.com/ and install it. What we are going to do is build a Docker image for TensorFlow Serving.
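As a preview of where we are headed, the steps above can be sketched with the Docker CLI. This is a minimal sketch, not the exact commands of this recipe: it assumes the public tensorflow/serving image, and the model directory and model name (my_model) are placeholders you would replace with your own export.

```shell
# Pull a prebuilt TensorFlow Serving image (image name assumed; tags may vary).
docker pull tensorflow/serving

# Serve a SavedModel exported under ./my_model (path and name are placeholders).
# 8500 is the conventional gRPC port; 8501 exposes the REST API.
docker run -p 8500:8500 -p 8501:8501 \
  --mount type=bind,source="$(pwd)/my_model",target=/models/my_model \
  -e MODEL_NAME=my_model \
  tensorflow/serving
```

Once the container is running, tensorflow_model_server inside it watches the mounted directory and picks up any new model version you export there, with no restart required.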