Putting it all together
Before we arrive at the last major section of this chapter, which looks at an actual case study and best practices, it is helpful to put all the generative AI categories together and understand how data flows from one to another and vice versa.
Earlier, we shared the CI/CD pipeline flow using Prompt Flow within the LLMOps construct. Now we will take a macro look, beyond just the LLM, at how messages would flow across the LLM application development stack and the broader generative AI ecosystem, using categories to organize the products and services.
While we do not endorse any specific services or technology (beyond our employer's), our goal here is to show how a typical LLM flow would appear using various generative AI toolsets, products, and services. We have organized the workloads by category, represented in the light gray boxes, along with a few products or services as examples within each category. We then use arrows to show how typical traffic...