Using the Lookup, Web, and Execute Pipeline activities
In this recipe, we shall implement error handling logic for our pipeline – similar to the previous recipe, but with a more sophisticated design: we shall isolate the error handling flow in its own pipeline, and our main parent pipeline will call this child pipeline. This recipe also introduces three very useful activities: Lookup, Web, and Execute Pipeline. It will illustrate how to retrieve information from an Azure SQL table and how to invoke other Azure services from the pipeline.
Getting ready
We shall be using all the Azure services that are mentioned in the Technical requirements section at the beginning of the chapter. In addition, this recipe requires a table to store the email addresses of the status email recipients. Please refer to the Technical requirements section for the table creation scripts and instructions.
We shall be building a pipeline that sends an email in the case of failure. There is no activity in ADF capable of sending emails, so we shall be using the Azure Logic Apps service. Follow these steps to create an instance of this service:
- In the Azure portal, look for Logic Apps in Azure services. Then, use the Add button to create a new logic app.
- Name your logic app ADF-Email-LogicApp and fill in the Subscription, Resource Group, and Region information fields. Click on Create and wait until your logic app is deployed. Then, click on Go to Resource.
- In the Logic App Designer, select the When a HTTP request is received trigger:
- In the displayed tile, click on Use sample payload to generate schema, and use the following code block:
{
  "subject": "<subject of the email message>",
  "messageBody": "<body of the email message>",
  "emailAddress": "<email-address>"
}
Enter the text in the textbox as shown in the following figure:
- Click on the Next Step button and choose the email service that you want to use to send the notification emails. For the purposes of this tutorial, we shall use Gmail.
Note:
Even though we use Gmail for the purposes of this tutorial, you can also send emails using Office 365 Outlook or Outlook.com. In the See also section of this recipe, we included a link to a tutorial on how to send emails using those providers.
- Select Gmail from the list of services and Send Email from Actions. Log in with your account credentials:
- From the Add new parameter dropdown, check the Subject and Body checkboxes:
- Place your cursor inside the To text field and enter @{triggerBody()['emailAddress']}.
- In a similar way, enter @{triggerBody()['subject']} in the Subject text field.
- Finally, in the Body text box, enter @{triggerBody()['messageBody']}:
- Save your logic app. In the first tile, you should see that the HTTP POST URL field was populated. This is the URL we'll use to invoke this logic app from the Data Factory pipeline.
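Once the logic app is saved, any client that can issue an HTTP POST can trigger it. The following Python sketch shows what such a call looks like; the URL and email address are placeholders for illustration, and the payload keys must match the sample schema entered above:

```python
import json
import urllib.request

def send_status_email(logic_app_url: str, subject: str,
                      message_body: str, email_address: str) -> int:
    """POST the notification payload to the Logic App HTTP trigger.

    The keys must match the sample payload used to generate the
    trigger's request schema (subject, messageBody, emailAddress).
    """
    payload = {
        "subject": subject,
        "messageBody": message_body,
        "emailAddress": email_address,
    }
    request = urllib.request.Request(
        logic_app_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status

# Example payload -- this is exactly what ADF's Web activity will send:
example = json.dumps({
    "subject": "ADF Pipeline Failure",
    "messageBody": "ADF Pipeline Failed",
    "emailAddress": "ops@example.com",  # hypothetical recipient
})
```

This is the same request that the Web activity will construct later in the recipe, which makes the sketch handy for testing the logic app before wiring up the pipeline.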
How to do it…
First, we shall create the child pipeline to retrieve the email addresses of the email recipients and send the status email:
- Create a new pipeline and name it pl_orchestration_recipe_5_child.
- From the Activities pane, select a Lookup activity and add it to the pipeline canvas. Configure it in the following way:
(a) In the General tab, change the activity name to Get Email Recipients.
(b) In the Settings tab, select AzureSQLTables as the value for Source dataset, and specify EmailRecipients for tableName.
(c) Also in the Settings tab, select the Use Query radio button and enter SELECT * FROM EmailRecipients into the text box. Make sure to uncheck the First row only checkbox at the bottom. Your Settings tab should look similar to the following figure:
- Next, add a ForEach activity to the canvas and configure it in the following way:
In the Settings tab, enter @activity('Get Email Recipients').output.value into the Items textbox.
- Click on the pencil icon within the ForEach activity. This will open a new canvas. Add a Web activity onto this canvas.
We shall now configure the Web activity. First, go to the General tab and rename it Send Email. Then, in the URL text field, paste the URL for the logic app (which you created in the Getting ready section). In the Method textbox, select POST.
In the Headers section, click on the New button to add a header. Enter Content-Type into the Name text box and application/json into the Value textbox.
In the Body text box, enter the following text (be sure to copy the quotes accurately):
@json(concat('{"emailAddress": "', item().emailAddress, '", "subject": "ADF Pipeline Failure", "messageBody": "ADF Pipeline Failed"}'))
Your Settings tab should look similar to Figure 2.34:
- Run this pipeline in Debug mode and verify that it works. You should have some test email addresses in the EmailRecipients table in order to test your pipeline. You can also verify that the email was sent out by going to the ADF-Email-LogicApp UI in the Azure portal and examining the run in the Overview pane:
- We are ready to design the parent pipeline, which will invoke the child pipeline we just tested. For this, clone the pipeline we designed in the Chaining and branching activities within your pipeline recipe. Rename your clone pl_orchestration_recipe_5_parent.
- In this pipeline, delete the On Failure Stored Procedure activity, and instead add an Execute Pipeline activity to the canvas. Connect it to the red square in the Copy From Blob to Azure SQL activity.
- Configure the Execute Pipeline activity:
In the General tab, change the name to Send Email On Failure.
In the Settings tab, specify the name of the invoked pipeline as pl_orchestration_recipe_5_child.
- The parent pipeline should already be configured with the incorrect table name in the Copy activity sink (we deliberately misconfigured it in order to test the On Failure flow). Verify that this is still the case and run the pipeline in Debug mode:
- Verify that the email was sent to the recipients.
- Publish your changes to save them.
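To make the child pipeline's expressions concrete, here is a small Python sketch of what happens at runtime. The lookup result and email addresses are invented sample data, but the shape mirrors the Lookup activity's output (rows under a value key), which is why the ForEach Items expression ends in .output.value:

```python
import json

# With "First row only" unchecked, the Lookup activity returns its
# rows under a "value" key -- hence the ForEach Items expression
# @activity('Get Email Recipients').output.value.
lookup_output = {
    "count": 2,
    "value": [
        {"emailAddress": "alice@example.com"},  # sample rows
        {"emailAddress": "bob@example.com"},
    ],
}

def web_activity_body(item: dict) -> dict:
    """Equivalent of the Web activity's Body expression.

    concat() splices the current ForEach item into a JSON string,
    and json() parses it into the object POSTed to the logic app.
    """
    raw = ('{"emailAddress": "' + item["emailAddress"]
           + '", "subject": "ADF Pipeline Failure", '
           + '"messageBody": "ADF Pipeline Failed"}')
    return json.loads(raw)

# The ForEach activity runs the Web activity once per item:
bodies = [web_activity_body(item) for item in lookup_output["value"]]
```

Each element of bodies is one email request, so a two-row lookup table produces two calls to the logic app.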
How it works…
In this recipe, we introduced the concept of parent and child pipelines and used the pipeline hierarchy to incorporate error handling functionality. This technique offers several benefits:
- It allows us to reuse existing pipelines.
- It makes it easier to design/debug parts of the pipeline separately.
- Finally, it allows users to design pipelines that contain more than 40 activities (Microsoft limits the number of activities per pipeline).
To craft the child pipeline, we started by adding a Lookup activity to retrieve the list of email recipients from the database table. This is a very common use of the Lookup activity: fetching a dataset for subsequent processing. In the configuration, we specified a query for the dataset retrieval: SELECT * FROM EmailRecipients. We could also use a more sophisticated query to filter the email recipients, or retrieve all the data by selecting the Table radio button instead. The ability to specify a query gives users a great deal of flexibility in filtering a dataset or applying field projections with very little effort.
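The difference between the recipe's query and a filtered one can be sketched with an in-memory SQLite stand-in for the Azure SQL table. The isActive column here is hypothetical, added purely to illustrate server-side filtering:

```python
import sqlite3

# In-memory stand-in for the Azure SQL EmailRecipients table.
# The isActive column is hypothetical -- it only illustrates how a
# filtered lookup query would work.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE EmailRecipients (emailAddress TEXT, isActive INTEGER)"
)
conn.executemany(
    "INSERT INTO EmailRecipients VALUES (?, ?)",
    [("alice@example.com", 1), ("bob@example.com", 0)],
)

# The query used in the recipe returns every recipient:
all_rows = conn.execute("SELECT * FROM EmailRecipients").fetchall()

# A more sophisticated query filters the recipients in the database,
# so the ForEach loop only iterates over the rows we actually want:
active = conn.execute(
    "SELECT emailAddress FROM EmailRecipients WHERE isActive = 1"
).fetchall()
```

Filtering in the Lookup query keeps the dataset passed to ForEach small, rather than fetching everything and branching inside the loop.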
The list of email recipients was processed by the ForEach activity. We encountered the ForEach activity in the previous recipe. However, inside the ForEach activity, we introduced a new kind of activity: the Web activity, which we configured to invoke a simple logic app. This illustrates the power of the Web activity: it enables the user to invoke external REST APIs without leaving the Data Factory pipeline.
There's more…
There is another ADF activity that offers the user an option to integrate external APIs into a pipeline: the Webhook activity. It has a lot of similarities to the Web activity, with two major differences:
- The Webhook activity always passes an implicit callBackUri property to the external service, along with the other parameters you specify in the request body. It expects the invoked web application to respond by calling back to that URI; if the response is not received within the configurable timeout period, the Webhook activity fails. The Web activity does not have a callBackUri property, and, while it does have a timeout period, it is not configurable: it is limited to 1 minute. This feature of the Webhook activity can be used to control the execution flow of the pipeline – for example, to wait for user input into a web form before proceeding with further steps.
- The Web activity allows users to pass linked services and datasets. This can be used for data movement to a remote endpoint. The Webhook activity does not offer this capability.
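The callback contract described above can be sketched from the external service's point of view. This is a simplified, hypothetical handler (not ADF code): it extracts the injected callBackUri from the incoming request and prepares the POST that would mark the Webhook activity as complete. The exact failure payload depends on the activity's Report status on callback setting, so only the success case is shown:

```python
import json
import urllib.request

def handle_webhook_request(request_body: str) -> str:
    """Parse the Webhook activity's request and return the callback URI.

    ADF injects callBackUri into the body alongside the user-supplied
    properties; the service must POST back to that URI before the
    activity's timeout to finish the Webhook activity.
    """
    body = json.loads(request_body)
    return body["callBackUri"]

def complete_activity(callback_uri: str) -> urllib.request.Request:
    """Build (but do not send) the completion call back to ADF.

    An empty JSON body signals successful completion.
    """
    return urllib.request.Request(
        callback_uri,
        data=json.dumps({}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Hypothetical incoming request from a Webhook activity:
incoming = json.dumps({
    "callBackUri": "https://example.com/adf/callback/123",
    "subject": "approval needed",   # user-supplied body property
})
uri = handle_webhook_request(incoming)
completion = complete_activity(uri)
```

Until that completion POST arrives, the pipeline waits at the Webhook activity, which is exactly what makes the wait-for-user-input pattern possible.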
See also
For more information about the Webhook activity, refer to the Microsoft documentation:
https://docs.microsoft.com/azure/data-factory/control-flow-webhook-activity
If you want to learn how to configure a logic app to send emails using providers other than Gmail, follow this tutorial:
https://docs.microsoft.com/azure/logic-apps/tutorial-process-email-attachments-workflow