Azure Data Engineering Cookbook

You're reading from Azure Data Engineering Cookbook: Design and implement batch and streaming analytics using Azure Cloud Services

Product type: Paperback
Published: Apr 2021
Publisher: Packt
ISBN-13: 9781800206557
Length: 454 pages
Edition: 1st
Authors (2):
Nagaraj Venkatesan
Ahmad Osama
Table of Contents (11)

Preface
1. Chapter 1: Working with Azure Blob Storage
2. Chapter 2: Working with Relational Databases in Azure
3. Chapter 3: Analyzing Data with Azure Synapse Analytics
4. Chapter 4: Control Flow Activities in Azure Data Factory
5. Chapter 5: Control Flow Transformation and the Copy Data Activity in Azure Data Factory
6. Chapter 6: Data Flows in Azure Data Factory
7. Chapter 7: Azure Data Factory Integration Runtime
8. Chapter 8: Deploying Azure Data Factory Pipelines
9. Chapter 9: Batch and Streaming Data Processing with Azure Databricks
10. Other Books You May Enjoy

Copying data from Azure Data Lake Gen2 to an Azure Synapse SQL pool using the copy activity

The copy activity, as the name suggests, is used to copy data quickly from a source to a destination. In this recipe, we'll learn how to use the copy activity to copy data from Azure Data Lake Gen2 to an Azure Synapse SQL pool.
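
Before the detailed steps, it may help to see what a copy activity amounts to. A Data Factory pipeline is defined in JSON, and a copy activity inside it points at a source dataset and a sink dataset. The following is a minimal, hypothetical sketch rather than this recipe's actual pipeline: the resource group (packtade), factory name (packtadf), and dataset names (OrdersDataLake, OrdersSqlPool) are placeholders, and both datasets are assumed to exist already.

    # Hypothetical copy activity: reads delimited text from a Data Lake
    # Gen2 dataset and writes to a Synapse SQL pool dataset.
    $pipelineJson = '
    {
      "name": "CopyOrdersPipeline",
      "properties": {
        "activities": [{
          "name": "CopyOrdersToSqlPool",
          "type": "Copy",
          "inputs":  [{ "referenceName": "OrdersDataLake", "type": "DatasetReference" }],
          "outputs": [{ "referenceName": "OrdersSqlPool", "type": "DatasetReference" }],
          "typeProperties": {
            "source": { "type": "DelimitedTextSource",
                        "storeSettings": { "type": "AzureBlobFSReadSettings" } },
            "sink":   { "type": "SqlDWSink" }
          }
        }]
      }
    }
    '
    Set-Content -Path .\CopyOrdersPipeline.json -Value $pipelineJson

    # Deploy the definition to the (placeholder) data factory;
    # -Force overwrites an existing pipeline without prompting.
    Set-AzDataFactoryV2Pipeline -ResourceGroupName packtade `
        -DataFactoryName packtadf -Name CopyOrdersPipeline `
        -DefinitionFile .\CopyOrdersPipeline.json -Force

SqlDWSink is the sink type Data Factory uses for Azure Synapse SQL pools; the recipe itself walks through building the equivalent pipeline step by step.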

Getting ready

Before you start, do the following:

  1. Log in to Azure from PowerShell by executing the Connect-AzAccount command and following the sign-in prompts (see the sketch after this list).
  2. Open https://portal.azure.com and log in using your Azure credentials.
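
As a quick sketch of step 1, the sign-in amounts to the following; the subscription name is a placeholder for your own:

    # Sign in interactively; a browser or device-code prompt appears.
    Connect-AzAccount

    # If the account has access to several subscriptions, select the one
    # to work in. 'My Subscription' stands in for your subscription name or ID.
    Set-AzContext -Subscription 'My Subscription'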

How to do it…

Follow these steps:

  1. The first step is to create a new Azure Data Lake Gen2 storage account and upload the data. Execute the following PowerShell script from the book's code bundle to do both (a rough sketch of what such a script does follows the command):
    .\ADE\azure-data-engineering-cookbook\Chapter04\1_UploadOrderstoDataLake.ps1 -resourcegroupname packtade -storageaccountname packtdatalakestore...
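
The script's contents ship with the book's code bundle and aren't reproduced here. As a rough, hypothetical sketch, creating a Data Lake Gen2 account and uploading a file with the Az cmdlets looks something like this (the container name and file path are illustrative):

    # Create a StorageV2 account with the hierarchical namespace enabled,
    # which is what makes it a Data Lake Gen2 account.
    $account = New-AzStorageAccount -ResourceGroupName packtade `
        -Name packtdatalakestore -Location eastus `
        -SkuName Standard_LRS -Kind StorageV2 `
        -EnableHierarchicalNamespace $true

    # Create a container (filesystem) and upload the orders data into it.
    New-AzStorageContainer -Name orders -Context $account.Context
    Set-AzStorageBlobContent -File .\orders.csv -Container orders `
        -Blob orders.csv -Context $account.Context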