Services, Policies, and User Input
Having understood the general outline of the inferencing pipeline, let’s delve into the details of containers, services, and policies as they apply to this example application.
In practice, deploying a containerized application almost always requires you to provide external inputs to set up and customize the application. Open Horizon provides this flexibility using a User Input file that works together with the service definition of each service. User Input is a new concept in this book, and we will go into its details later in this section.
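As an illustration of the idea, a User Input file supplies values for the configurable variables that a service definition declares in its userInput section. The sketch below is hypothetical: the organization name (myorg), service URL (infer), and variable name (MODEL_NAME) are placeholders, not values from this application:

```json
{
  "services": [
    {
      "org": "myorg",
      "url": "infer",
      "versionRange": "[0.0.0,INFINITY)",
      "variables": {
        "MODEL_NAME": "mnist"
      }
    }
  ]
}
```

A file like this is typically passed when the edge node is registered, so the same service image can be customized per deployment without being rebuilt.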
Services
This application has three containerized services that build the inferencing pipeline: infer, http, and mms. Each service is deployed using its own container image. We will introduce each of these three services here and describe them in detail later, when they are deployed and used.
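Each service is described to Open Horizon by a service definition that, among other things, names the container image to run. The following is a minimal, hypothetical sketch for a service like infer; the organization, image path, and version are placeholder values, not this application's actual metadata:

```json
{
  "org": "myorg",
  "url": "infer",
  "version": "1.0.0",
  "arch": "amd64",
  "sharable": "singleton",
  "deployment": {
    "services": {
      "infer": {
        "image": "docker.io/myorg/infer:1.0.0",
        "privileged": false
      }
    }
  }
}
```

The deployment.services map is where a service definition binds a service name to its container image, which is why each of the three services above ships as its own image.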
infer
The infer service is the main service of the inferencing pipeline...