Azure Data Factory Cookbook

Chapter 2: Orchestration and Control Flow

Technical requirements

Note

Throughout the chapter, we suggest names for the accounts, pipelines, and other resources used in the recipes. Many services, such as Azure Storage and SQL Server, require that the names you assign are globally unique, so follow your own preferred naming conventions and make the appropriate substitutions as you work through the recipes. For the Azure resource naming rules, refer to the documentation at https://docs.microsoft.com/azure/azure-resource-manager/management/resource-name-rules.

In addition to Azure Data Factory, we shall be using three other Azure services: Logic Apps, Blob Storage, and Azure SQL Database. You will need an Azure Blob Storage account and an Azure SQL Database instance set up before following the recipes. The following steps describe the necessary preparation (a scripted sketch of these steps follows Figure 2.1):

  • Create an Azure Blob Storage account and name it adforchestrationstorage. When creating the storage account, select the same region (that is, East US) that you selected when you created the Data Factory instance; keeping both services in the same region reduces data movement costs.
  • Create a container named data within this storage account, and upload two CSV files to it: airlines.csv and countries.csv (the files can be found on GitHub: https://github.com/PacktPublishing/Azure-Data-Factory-Cookbook/tree/master/data).
  • Create an Azure SQL Database instance and name it AzureSQLDatabase. When you create the Azure SQL Database instance, you will have the option of creating a server on which the SQL database will be hosted. Create that server, and take note of the credentials you entered. You will need these credentials later when you log in to your database.

    Choose the Basic service tier for your database to save on costs. Once your instance is up and running, configure the firewall and network settings for the SQL server: make sure that you set the Allow Azure services and resources to access this database option to Yes. You may also need to add a firewall rule that allows your own IP address to access the database:

    Figure 2.1 – Firewall configuration
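The recipes set these resources up through the Azure portal. If you prefer to script the preparation, the following is a minimal sketch (not part of the book's recipes) using the Azure SDK for Python, assuming the azure-identity, azure-mgmt-storage, azure-mgmt-sql, and azure-storage-blob packages are installed. The subscription ID, resource group, SQL server name, and credentials are placeholders you must supply, and uploading blobs with DefaultAzureCredential assumes your account holds a data-plane role such as Storage Blob Data Contributor.

# Sketch of the preparation steps using the Azure SDK for Python (placeholders in <...>).
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.sql import SqlManagementClient
from azure.storage.blob import BlobServiceClient

SUBSCRIPTION_ID = "<your-subscription-id>"   # placeholder
RESOURCE_GROUP = "<your-resource-group>"     # placeholder; must already exist
SQL_SERVER = "<your-sql-server-name>"        # placeholder; must be globally unique
LOCATION = "eastus"                          # same region as your Data Factory instance

credential = DefaultAzureCredential()

# Storage account, the 'data' container, and the two CSV files
storage_client = StorageManagementClient(credential, SUBSCRIPTION_ID)
storage_client.storage_accounts.begin_create(
    RESOURCE_GROUP,
    "adforchestrationstorage",
    {"location": LOCATION, "kind": "StorageV2", "sku": {"name": "Standard_LRS"}},
).result()

blob_service = BlobServiceClient(
    account_url="https://adforchestrationstorage.blob.core.windows.net",
    credential=credential,
)
container = blob_service.create_container("data")
for name in ("airlines.csv", "countries.csv"):
    with open(name, "rb") as f:              # files downloaded from the book's GitHub repo
        container.upload_blob(name=name, data=f)

# Logical SQL server, the AzureSQLDatabase database (Basic tier), and firewall rules
sql_client = SqlManagementClient(credential, SUBSCRIPTION_ID)
sql_client.servers.begin_create_or_update(
    RESOURCE_GROUP,
    SQL_SERVER,
    {
        "location": LOCATION,
        "administrator_login": "<admin-login>",              # placeholder
        "administrator_login_password": "<admin-password>",  # placeholder
    },
).result()
sql_client.databases.begin_create_or_update(
    RESOURCE_GROUP,
    SQL_SERVER,
    "AzureSQLDatabase",
    {"location": LOCATION, "sku": {"name": "Basic"}},
).result()
# The 0.0.0.0-0.0.0.0 rule corresponds to "Allow Azure services and resources to
# access this server"; the second rule admits your own IP address.
sql_client.firewall_rules.create_or_update(
    RESOURCE_GROUP, SQL_SERVER, "AllowAllWindowsAzureIps",
    {"start_ip_address": "0.0.0.0", "end_ip_address": "0.0.0.0"},
)
sql_client.firewall_rules.create_or_update(
    RESOURCE_GROUP, SQL_SERVER, "AllowMyIp",
    {"start_ip_address": "<your-ip>", "end_ip_address": "<your-ip>"},
)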

Download the SQL scripts from GitHub at https://github.com/PacktPublishing/Azure-Data-Factory-Cookbook/tree/master/Chapter02/sql-scripts and run them against AzureSQLDatabase (one way of running them is sketched after the following list):

  • CreateAirlineTable.sql and CreateCountryTable.sql: These scripts will add two tables, Country and Airline, which are used in several recipes, including the first one.
  • CreateMetadataTable.sql: This will create the FileMetadata table and a stored procedure to insert data into that table. This table is necessary for the Using Metadata and Stored Procedure activities and Filtering your data and looping through your files recipes.
  • CreateActivityLogsTable.sql: This will create the PipelineLog table and a stored procedure to insert data into that table. This table is necessary for the Chaining and branching activities within your pipeline recipe.
  • CreateEmailRecipients.sql: This script will create the EmailRecipients table and populate it with a record. This table is used in the Using the Lookup, Web, and Execute Pipeline activities recipe. You will need to edit it to enter email recipient information.
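You can run these scripts from the Azure portal's query editor or any SQL client. As one hedged illustration, the snippet below executes every downloaded script against AzureSQLDatabase with pyodbc; it assumes the Microsoft ODBC Driver 18 for SQL Server is installed, the scripts sit in a local sql-scripts folder, and the firewall rules from Figure 2.1 allow your IP address.

# Sketch: run the downloaded SQL scripts against AzureSQLDatabase (placeholders in <...>).
import glob
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=<your-sql-server-name>.database.windows.net;"
    "DATABASE=AzureSQLDatabase;"
    "UID=<admin-login>;PWD=<admin-password>",   # placeholders: the server credentials you noted earlier
    autocommit=True,
)
cursor = conn.cursor()

for script_path in sorted(glob.glob("sql-scripts/*.sql")):
    with open(script_path, encoding="utf-8") as f:
        script = f.read()
    # GO is a client-side batch separator, not T-SQL, so split the script on
    # lines containing only GO and execute each batch separately.
    batch_lines = []
    for line in script.splitlines():
        if line.strip().upper() == "GO":
            if any(l.strip() for l in batch_lines):
                cursor.execute("\n".join(batch_lines))
            batch_lines = []
        else:
            batch_lines.append(line)
    if any(l.strip() for l in batch_lines):
        cursor.execute("\n".join(batch_lines))

conn.close()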