Azure Data Factory Cookbook
Build ETL, Hybrid ETL, and ELT pipelines using ADF, Synapse Analytics, Fabric and Databricks

Product type: Paperback
Published: Feb 2024
Publisher: Packt
ISBN-13: 9781803246598
Length: 532 pages
Edition: 2nd Edition
Authors (4): Tonya Chernyshova, Xenia Ireton, Dmitry Foshin, Dmitry Anoshin
Table of Contents (15)

Preface
1. Getting Started with ADF
2. Orchestration and Control Flow
3. Setting Up Synapse Analytics
4. Working with Data Lake and Spark Pools
5. Working with Big Data and Databricks
6. Data Migration – Azure Data Factory and Other Cloud Services
7. Extending Azure Data Factory with Logic Apps and Azure Functions
8. Microsoft Fabric and Power BI, Azure ML, and Cognitive Services
9. Managing Deployment Processes with Azure DevOps
10. Monitoring and Troubleshooting Data Pipelines
11. Working with Azure Data Explorer
12. The Best Practices of Working with ADF
13. Other Books You May Enjoy
14. Index

Creating an ADF pipeline using the Copy Data tool

In the previous recipe, we created an ADF pipeline using the UI. However, we can also use the Copy Data tool (CDT). The CDT lets us load data into Azure Storage with less effort: we don't need to set up linked services, pipelines, and datasets by hand as we did before. In other words, depending on the task at hand, you can use either the ADF UI or the CDT. Typically, the CDT is the right choice for simple load operations, when we have many data files and want to ingest them into the data lake as quickly as possible.
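
For context, the CDT is essentially a wizard: behind the scenes it generates the same kind of pipeline with a Copy activity that we built by hand in the previous recipe. Below is a minimal sketch of what such a copy pipeline looks like when created programmatically with the azure-mgmt-datafactory Python SDK; the subscription, resource group, factory, and dataset names are hypothetical placeholders, and both datasets are assumed to already exist.

```python
# A minimal sketch of the kind of pipeline the Copy Data tool generates,
# built with the azure-mgmt-datafactory SDK. All names (subscription,
# resource group, factory, datasets) are hypothetical placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    BlobSink, BlobSource, CopyActivity, DatasetReference, PipelineResource,
)

SUBSCRIPTION_ID = "<your-subscription-id>"
RG_NAME = "adf-cookbook-rg"          # hypothetical resource group
DF_NAME = "adf-cookbook-factory"     # hypothetical data factory

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# A Copy activity wiring an existing input dataset to an output dataset;
# both datasets (and their linked service) are assumed to exist already.
copy_activity = CopyActivity(
    name="CopyFromInputToOutput",
    inputs=[DatasetReference(reference_name="InputBlobDataset", type="DatasetReference")],
    outputs=[DatasetReference(reference_name="OutputBlobDataset", type="DatasetReference")],
    source=BlobSource(),
    sink=BlobSink(),
)

pipeline = PipelineResource(activities=[copy_activity])
adf_client.pipelines.create_or_update(RG_NAME, DF_NAME, "CopyDataToolPipeline", pipeline)

# Trigger a one-off run, much as the CDT does when you finish the wizard.
run = adf_client.pipelines.create_run(RG_NAME, DF_NAME, "CopyDataToolPipeline", parameters={})
print(f"Pipeline run ID: {run.run_id}")
```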

Getting ready

In this recipe, we will use the CDT to accomplish the same task: copying data from one folder to another.

How to do it...

We have already created the ADF pipeline with the UI. Now let's walk through the same task with the CDT:

  1. In the previous recipe, we created an Azure Blob Storage account and container. We will use the same file and the same container here. However, we first have to delete the file from the output location (a sketch of doing this programmatically follows this list).
  2. ...
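
As noted in step 1, the file must be removed from the output location before rerunning the copy. Here is a minimal sketch of doing so with the azure-storage-blob SDK; the connection string, container name, and blob path are hypothetical placeholders.

```python
# Delete the previously copied file from the output location so the CDT
# run starts from a clean state. Container/blob names are hypothetical.
from azure.storage.blob import BlobClient

blob = BlobClient.from_connection_string(
    conn_str="<your-storage-connection-string>",
    container_name="data",            # hypothetical container
    blob_name="output/airlines.csv",  # hypothetical output file path
)
blob.delete_blob()
```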