Mounting an Azure Data Lake container in Databricks
Accessing data in Azure Data Lake is one of the fundamental steps in processing data with Databricks. In this recipe, we will learn how to mount an Azure Data Lake container in Databricks using a service principal. We will use Azure Key Vault to store the service principal's client ID and client secret, which are needed to mount the Data Lake container in Databricks.
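For orientation, the mount we will build in this recipe follows the standard OAuth pattern for Azure Data Lake Storage Gen2. The following is a minimal sketch; the secret scope name (KeyVaultScope), secret key names (ClientID, ClientSecret, TenantID), container (rawdata), storage account (adlsgen2demo), and mount point are all hypothetical placeholders, with the real values created in the steps that follow:

```python
# Minimal sketch of the mount built in this recipe.
# All names below (secret scope, secret keys, container, storage
# account, mount point) are placeholders; substitute your own.

# Retrieve the service principal credentials from the Key Vault-backed
# secret scope (dbutils is available by default in Databricks notebooks).
client_id = dbutils.secrets.get(scope="KeyVaultScope", key="ClientID")
client_secret = dbutils.secrets.get(scope="KeyVaultScope", key="ClientSecret")
tenant_id = dbutils.secrets.get(scope="KeyVaultScope", key="TenantID")

# OAuth configuration for ADLS Gen2 (abfss) using the service principal.
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": client_id,
    "fs.azure.account.oauth2.client.secret": client_secret,
    "fs.azure.account.oauth2.client.endpoint":
        f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
}

# Mount the container so it is reachable as a regular DBFS path.
dbutils.fs.mount(
    source="abfss://rawdata@adlsgen2demo.dfs.core.windows.net/",
    mount_point="/mnt/rawdata",
    extra_configs=configs,
)
```

Once mounted, the container can be browsed like any DBFS path, for example with dbutils.fs.ls("/mnt/rawdata").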
Getting ready
Create a Databricks workspace and a cluster, as explained in the Configuring the Azure Databricks environment recipe of this chapter.
Create a key vault in Azure and integrate it with Azure Databricks, as explained in the Integrating Databricks with Azure Key Vault recipe. A quick way to verify this integration is shown after these steps.
Create an Azure Data Lake account, as explained in the Provisioning an Azure Storage account using the Azure portal recipe of Chapter 1, Creating and Managing Data in Azure Data Lake.
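Before moving on, it is worth confirming that the Key Vault-backed secret scope is visible from a notebook attached to your cluster. A short check, again assuming the scope is named KeyVaultScope (a placeholder):

```python
# List the secret scopes visible to this workspace, then the secret
# keys in our (hypothetical) Key Vault-backed scope. Only the key
# names are shown; secret values are never displayed.
print(dbutils.secrets.listScopes())
print(dbutils.secrets.list("KeyVaultScope"))
```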
Go to the Azure Data Lake Storage account...