Azure Data Factory is a cloud-based service for orchestrating big data processing and analytics. It ingests raw data from a variety of data sources and turns it into insights that business decision makers, analysts, and data scientists can act on. The following concepts are used to process and compose data into data-driven workflows:
- Data pipelines: A logical grouping of activities that together perform a unit of work.
- Activities: Each activity represents a single processing step in a pipeline. For instance, you can create a copy activity to copy data from an Azure Blob Storage account to an HDInsight cluster (see the sketch after this list). Azure Data Factory supports three types of activities: data movement activities, data transformation activities, and control activities.
- Datasets: Named views of the data in the data stores, used as the inputs and outputs of activities.
- Linked services: Azure Data Factory uses linked services to define the connection information it needs to reach external resources such as data stores and compute services.
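
The sketch below shows how these concepts fit together, using the Azure SDK for Python (azure-mgmt-datafactory) to create a linked service, two datasets, and a pipeline containing a single copy activity. It is a minimal illustration, not a complete deployment: the subscription, resource group, factory, storage account, and container names are placeholders, and the model classes and method calls follow the SDK's quickstart pattern, which may differ slightly across SDK versions.

```python
# Minimal sketch: compose a linked service, datasets, and a pipeline with a
# copy activity via the Azure SDK for Python (azure-mgmt-datafactory).
# All resource names and connection details below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureBlobStorageLinkedService, LinkedServiceResource, LinkedServiceReference,
    AzureBlobDataset, DatasetResource, DatasetReference,
    CopyActivity, BlobSource, BlobSink, PipelineResource,
)

SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "my-resource-group"
FACTORY_NAME = "my-data-factory"

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Linked service: connection information for the Azure Blob Storage account.
storage_ls = LinkedServiceResource(
    properties=AzureBlobStorageLinkedService(
        connection_string="DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>"
    )
)
adf_client.linked_services.create_or_update(
    RESOURCE_GROUP, FACTORY_NAME, "BlobStorageLinkedService", storage_ls)

# Datasets: named views of the input and output data in the blob store.
ls_ref = LinkedServiceReference(type="LinkedServiceReference",
                                reference_name="BlobStorageLinkedService")
input_ds = DatasetResource(properties=AzureBlobDataset(
    linked_service_name=ls_ref, folder_path="input-container/raw", file_name="data.csv"))
output_ds = DatasetResource(properties=AzureBlobDataset(
    linked_service_name=ls_ref, folder_path="output-container/staged"))
adf_client.datasets.create_or_update(RESOURCE_GROUP, FACTORY_NAME, "InputDataset", input_ds)
adf_client.datasets.create_or_update(RESOURCE_GROUP, FACTORY_NAME, "OutputDataset", output_ds)

# Pipeline: a single copy activity (a data movement activity) that moves data
# from the input dataset to the output dataset.
copy_activity = CopyActivity(
    name="CopyRawToStaged",
    inputs=[DatasetReference(type="DatasetReference", reference_name="InputDataset")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="OutputDataset")],
    source=BlobSource(),
    sink=BlobSink(),
)
pipeline = PipelineResource(activities=[copy_activity])
adf_client.pipelines.create_or_update(RESOURCE_GROUP, FACTORY_NAME, "CopyPipeline", pipeline)

# Trigger a one-off run of the pipeline.
run = adf_client.pipelines.create_run(RESOURCE_GROUP, FACTORY_NAME, "CopyPipeline", parameters={})
print(f"Started pipeline run {run.run_id}")
```

Note how the dependency chain mirrors the list above: datasets reference a linked service, the copy activity references datasets, and the pipeline groups the activity into a deployable unit of work.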