Triggering and monitoring our pipeline
We have created our first pipeline and run it manually in a debug environment. In a real-life scenario, we would schedule a trigger to run the pipeline. Once the pipeline is scheduled, we want to monitor its runs, including their status, row counts, and runtimes. We also want to be alerted when a run fails.
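As an aside, run monitoring does not have to happen in the portal. The following is a minimal sketch of querying recent pipeline runs with the azure-mgmt-datafactory Python SDK; the subscription ID, resource group, and factory names are placeholders for illustration, not values from this recipe, which uses the Azure portal throughout.

```python
# A minimal monitoring sketch using the azure-mgmt-datafactory SDK.
# Resource names below are placeholders, not this recipe's resources.
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters

SUBSCRIPTION_ID = "<subscription-id>"   # placeholder
RESOURCE_GROUP = "ADFCookbook"          # hypothetical resource group
FACTORY_NAME = "ADFCookbook-Factory"    # hypothetical data factory

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Query all pipeline runs from the last 24 hours.
now = datetime.now(timezone.utc)
runs = client.pipeline_runs.query_by_factory(
    RESOURCE_GROUP,
    FACTORY_NAME,
    RunFilterParameters(
        last_updated_after=now - timedelta(days=1),
        last_updated_before=now,
    ),
)

# Print the status and runtime of each run; a failed run here is
# exactly what the alert we create later would notify us about.
for run in runs.value:
    print(run.pipeline_name, run.status, run.run_start, run.duration_in_ms)
```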
This recipe will show you how easily this can be done with Azure Data Factory. It will teach you how to perform the following actions:
- Load metadata and accumulate it into a storage account
- Schedule a trigger to run our pipeline (an SDK equivalent is sketched after this list)
- Create an alert for any pipeline failure (also sketched after this list)
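For orientation, here is a hedged sketch of what scheduling a trigger looks like when done through the Python SDK rather than the portal. The steps in this recipe use the Data Factory UI; every resource name below, including the pipeline name, is a placeholder.

```python
# Sketch: create and start a daily schedule trigger with the
# azure-mgmt-datafactory SDK. Names are placeholders, not the
# recipe's actual resources.
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineReference,
    ScheduleTrigger,
    ScheduleTriggerRecurrence,
    TriggerPipelineReference,
    TriggerResource,
)

SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "ADFCookbook"            # hypothetical
FACTORY_NAME = "ADFCookbook-Factory"      # hypothetical
PIPELINE_NAME = "CopyMetadataPipeline"    # hypothetical name for the previous recipe's pipeline

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Run the pipeline once a day, starting five minutes from now.
trigger = TriggerResource(
    properties=ScheduleTrigger(
        recurrence=ScheduleTriggerRecurrence(
            frequency="Day",
            interval=1,
            start_time=datetime.now(timezone.utc) + timedelta(minutes=5),
            time_zone="UTC",
        ),
        pipelines=[
            TriggerPipelineReference(
                pipeline_reference=PipelineReference(reference_name=PIPELINE_NAME)
            )
        ],
    )
)
client.triggers.create_or_update(RESOURCE_GROUP, FACTORY_NAME, "DailyTrigger", trigger)

# A newly created trigger is stopped; it must be started before it fires.
client.triggers.begin_start(RESOURCE_GROUP, FACTORY_NAME, "DailyTrigger").result()
```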
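Similarly, the failure alert can be expressed as an Azure Monitor metric alert on the factory's built-in PipelineFailedRuns metric. The sketch below uses the azure-mgmt-monitor SDK under the same placeholder assumptions; the action group resource ID is likewise hypothetical, and the recipe itself configures the alert in the portal.

```python
# Sketch: an Azure Monitor metric alert that fires when any pipeline
# run in the factory fails. All IDs below are placeholders.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient
from azure.mgmt.monitor.models import (
    MetricAlertAction,
    MetricAlertResource,
    MetricAlertSingleResourceMultipleMetricCriteria,
    MetricCriteria,
)

SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "ADFCookbook"  # hypothetical
FACTORY_ID = (  # resource ID of the data factory (placeholder values)
    f"/subscriptions/{SUBSCRIPTION_ID}/resourceGroups/{RESOURCE_GROUP}"
    "/providers/Microsoft.DataFactory/factories/ADFCookbook-Factory"
)
ACTION_GROUP_ID = "<action-group-resource-id>"  # where notifications go

client = MonitorManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

alert = MetricAlertResource(
    location="global",  # metric alert rules are global resources
    description="Alert on any failed pipeline run",
    severity=1,
    enabled=True,
    scopes=[FACTORY_ID],
    evaluation_frequency=timedelta(minutes=5),
    window_size=timedelta(minutes=5),
    criteria=MetricAlertSingleResourceMultipleMetricCriteria(
        all_of=[
            MetricCriteria(
                name="FailedRuns",
                metric_name="PipelineFailedRuns",  # built-in ADF metric
                time_aggregation="Total",
                operator="GreaterThan",
                threshold=0,
            )
        ]
    ),
    actions=[MetricAlertAction(action_group_id=ACTION_GROUP_ID)],
)
client.metric_alerts.create_or_update(RESOURCE_GROUP, "PipelineFailureAlert", alert)
```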
Getting ready
This recipe assumes that you have access to an Azure subscription. It can be a free trial subscription, as described in the Creating an Azure subscription recipe in Chapter 1, Getting Started with Azure and SSIS 2019. It also assumes that you have created the pipeline from the previous recipe, Moving and transforming data.
How to do it…
Let's…