Technical requirements
Note
To fully understand the recipes, we make naming suggestions for the accounts, pipelines, and so on throughout the chapter. Many services, such as Azure Storage and SQL Server, require that the names you assign are unique. Follow your own preferred naming conventions, making appropriate substitutions as you follow the recipes. For the Azure resource naming rules, refer to the documentation at https://docs.microsoft.com/azure/azure-resource-manager/management/resource-name-rules.
In addition to Azure Data Factory, we will be using three other Azure services: Logic Apps, Blob Storage, and Azure SQL Database. You will need Azure Blob Storage and Azure SQL Database accounts set up to follow the recipes. The following steps describe the necessary preparation:
- Create an Azure Blob Storage account and name it `adforchestrationstorage`. When creating the storage account, select the same region (that is, East US) as you selected when you created the Data Factory instance. This will save costs when moving data. If you prefer to script this setup, see the Azure CLI sketch after this list.
- Create a container named `data` within this storage account, and upload two CSV files to the container: `airlines.csv` and `countries.csv` (the files can be found on GitHub: https://github.com/PacktPublishing/Azure-Data-Factory-Cookbook/tree/master/data).
- Create an Azure SQL Database instance and name it `AzureSQLDatabase`. When you create the instance, you will have the option of creating a server on which the SQL database will be hosted. Create that server, and take note of the credentials you entered; you will need them later to log in to your database. Choose the basic configuration for your SQL server to save on costs. Once your instance is up and running, configure the firewall and network settings for the SQL server: make sure that you set the Allow Azure services and resources to access this database option to Yes. You may also need to create a rule to allow your IP address to access the database. An Azure CLI sketch for these steps also follows the list.
Download the SQL scripts from GitHub at https://github.com/PacktPublishing/Azure-Data-Factory-Cookbook/tree/master/Chapter02/sql-scripts (a sqlcmd sketch for running them against your database follows this list):

- `CreateAirlineTable.sql` and `CreateCountryTable.sql`: These scripts will add two tables, `Country` and `Airline`, which are used in several recipes, including the first one.
- `CreateMetadataTable.sql`: This will create the `FileMetadata` table and a stored procedure to insert data into that table. This table is necessary for the Using Metadata and Stored Procedure activities and Filtering your data and looping through your files recipes.
- `CreateActivityLogsTable.sql`: This will create the `PipelineLog` table and a stored procedure to insert data into that table. This table is necessary for the Chaining and branching activities within your pipeline recipe.
- `CreateEmailRecipients.sql`: This script will create the `EmailRecipients` table and populate it with a record. This table is used in the Using the Lookup, Web, and Execute Pipeline activities recipe. You will need to edit it to enter email recipient information.