Using Azure Data Factory (ADF) to orchestrate the E2E pipeline
ADF is a serverless data integration and transformation service in Microsoft Azure. It provides cloud-based Extract, Transform, Load (ETL) and Extract, Load, Transform (ELT) capabilities. In this recipe, we will learn how to orchestrate and automate a data pipeline using ADF.
Getting ready
Before starting with this recipe, ensure that you have a valid Azure subscription, permission to create an ADF resource, and the details of an Azure Databricks workspace, including an access token.
Note
Explaining ADF in depth is beyond the scope of this book; readers are expected to have basic knowledge of creating an ADF pipeline and scheduling it.
How to do it…
In this section, we will learn how to invoke a Databricks notebook from an ADF pipeline and how to schedule an E2E data pipeline using an ADF trigger.
- Open an ADF workspace from the Azure portal and click on the Author & Monitor link to...
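To make the notebook step concrete, the following Python sketch assembles the kind of pipeline JSON that the ADF authoring UI generates when you add a Databricks Notebook activity and point it at a Databricks linked service. The pipeline name, notebook path, and linked-service name here are hypothetical placeholders, not values from this recipe:

```python
import json

def databricks_notebook_pipeline(notebook_path, linked_service_name):
    """Build an ADF pipeline definition containing one Databricks Notebook activity.

    The structure mirrors the JSON shown in ADF's code view; all names
    below are illustrative placeholders.
    """
    return {
        "name": "RunDatabricksNotebook",
        "properties": {
            "activities": [
                {
                    "name": "NotebookActivity",
                    "type": "DatabricksNotebook",
                    "typeProperties": {
                        # Workspace path of the notebook to run
                        "notebookPath": notebook_path,
                    },
                    "linkedServiceName": {
                        # The Databricks linked service holds the workspace
                        # URL and access token mentioned in Getting ready
                        "referenceName": linked_service_name,
                        "type": "LinkedServiceReference",
                    },
                }
            ]
        },
    }

pipeline = databricks_notebook_pipeline(
    "/Recipes/IngestAndTransform", "AzureDatabricksLinkedService"
)
print(json.dumps(pipeline, indent=2))
```

When the pipeline runs (manually or via a trigger), ADF resolves the linked service, attaches to or spins up a Databricks cluster, and executes the referenced notebook.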