Creating parallel ingest tasks
When working with data, we rarely perform a single ingestion at a time; a real-world project usually involves many ingestions running simultaneously, often in parallel. We already know that two or more DAGs can be scheduled to run alongside each other, but what about tasks inside a single DAG?
This recipe illustrates how to create parallel task execution inside a single Airflow DAG.
Getting ready
Please refer to the Getting ready section of the Configuring Airflow recipe, since this recipe uses the same setup.
To avoid redundancy, this exercise won’t explicitly include the imports or the main DAG configuration; the focus is on organizing the operators’ workflow. You can use the same logic as in the Creating DAGs recipe to create your DAG structure.
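In Airflow itself, parallelism is expressed by wiring a task to a list of downstream tasks (for example, `start >> [task_a, task_b] >> end`), and the scheduler runs the listed tasks concurrently. Independent of Airflow, the same fan-out/fan-in idea can be sketched in plain Python with the standard library; the source names and the `ingest` function below are hypothetical stand-ins, not part of the recipe's actual code:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical ingest sources; in the real recipe, each of these would
# correspond to an Airflow operator pulling from an actual endpoint.
SOURCES = ["sales", "customers", "inventory"]

def ingest(source: str) -> str:
    # Stand-in for a real ingestion step (API call, file copy, etc.).
    return f"{source}: ingested"

# Fan out: run one ingest task per source in parallel threads,
# then fan in to a single consolidation step.
with ThreadPoolExecutor(max_workers=len(SOURCES)) as pool:
    results = list(pool.map(ingest, SOURCES))  # map preserves input order

consolidated = "; ".join(results)
print(consolidated)
```

This mirrors what the DAG does declaratively: the `ThreadPoolExecutor` plays the role of the Airflow scheduler, and the final join plays the role of the downstream task that depends on all parallel branches.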
For the complete Python file used here, go to the GitHub page here: https://github.com/PacktPublishing/Data-Ingestion-with-Python-Cookbook/tree/main/Chapter_9/creating_parallel_ingest_tasks...