Creating an ADF pipeline by using the Copy Data tool
We just reviewed how to create an ADF job using the UI. However, we can also use the Copy Data tool (CDT). The CDT allows us to load data into Azure Storage faster: we don't need to set up linked services, pipelines, and datasets as we did in the previous recipe. In other words, depending on your activity, you can use either the ADF UI or the CDT. Usually, we use the CDT for simple load operations, when we have many data files and want to ingest them into the data lake as quickly as possible.
Getting ready
In this recipe, we will use the CDT to perform the same task as before: copying data from one folder to another.
How to do it...
We have already created an ADF job with the UI. Let's now walk through the same task with the CDT:
- In the previous recipe, we created an Azure Blob storage account and container. We will use the same file and the same container; however, we first have to delete the file from the output location.
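If you prefer the command line to the portal, the cleanup step above can be done with the Azure CLI. This is a minimal sketch; the account, container, and blob names are placeholders for whatever you used in the previous recipe, and it assumes you are already signed in with `az login`:

```shell
# Placeholder names -- replace with the values from your own setup
ACCOUNT_NAME="mystorageaccount"
CONTAINER_NAME="mycontainer"
BLOB_NAME="output/planes.csv"

# Delete the previously copied file from the output location
az storage blob delete \
    --account-name "$ACCOUNT_NAME" \
    --container-name "$CONTAINER_NAME" \
    --name "$BLOB_NAME" \
    --auth-mode login

# Verify it is gone: 'exists' should now report false
az storage blob exists \
    --account-name "$ACCOUNT_NAME" \
    --container-name "$CONTAINER_NAME" \
    --name "$BLOB_NAME" \
    --auth-mode login
```

Using `--auth-mode login` relies on your Azure AD sign-in; alternatively, you can pass `--account-key` if you have the storage account key at hand.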
- Go to Azure Storage...