What is a data pipeline?
A data pipeline is a series of tasks, such as transformations, filters, aggregations, and merges of multiple sources, that processes raw data before loading the result into a target. In layman's terms, a data pipeline moves data from the "source" to the "target," as depicted in the following diagram:
Figure 2.1: A sample ETL process illustration
You can think of pipelines as the pneumatic transport tubes in a mailroom. Mail is placed in specific tubes and carried to specific processing centers. Based on its labels, the mail is then sorted into specific pathways that eventually bring it to its destination. The core concept of a data pipeline is quite similar. Like mail, packets of raw data are ingested at the entry of the pipeline and, through a series of steps and processes, the raw material is formatted, packaged, and delivered to an output location, which is most commonly a storage system.
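To make the idea concrete, here is a minimal sketch of the extract, transform, and load stages in plain Python. The record fields, the filter threshold, and the in-memory "target" are illustrative assumptions, not part of any specific pipeline described in this chapter:

```python
def extract():
    # Ingest raw "packets" of data; here, hard-coded order records
    # stand in for a real source such as a file or database.
    return [
        {"order_id": 1, "region": "east", "amount": 120.0},
        {"order_id": 2, "region": "west", "amount": 80.0},
        {"order_id": 3, "region": "east", "amount": 45.5},
    ]

def transform(records):
    # Filter out small orders, then aggregate total amount per region.
    filtered = [r for r in records if r["amount"] >= 50]
    totals = {}
    for r in filtered:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

def load(totals, target):
    # Write the processed output into the target; a dict stands in
    # for a table, file, or data warehouse.
    target.update(totals)

target_store = {}
load(transform(extract()), target_store)
print(target_store)  # {'east': 120.0, 'west': 80.0}
```

In a production pipeline, each of these functions would typically be a separate task managed by an orchestrator, but the source-to-target flow is the same.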
From a business...