Defining ingest-dependent DAGs
In the data world, there is considerable discussion about how to organize Airflow DAGs. The approach I generally use is to create one DAG per pipeline, grouped by business logic or final destination. Nevertheless, sometimes a task inside one DAG can only proceed once another DAG has finished its processing and produced its output.
In this recipe, we will create two DAGs, where the first depends on the second completing successfully; otherwise, it will not run to completion. To assist us, we will use the ExternalTaskSensor operator.
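As a rough preview, the following sketch shows the pattern we will build: a downstream DAG that uses ExternalTaskSensor to wait for a task in the holiday_ingest DAG before running its own work. The DAG ID holiday_report and the task IDs used here are illustrative placeholders, not necessarily the exact names used later in the recipe:

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.sensors.external_task import ExternalTaskSensor

# Downstream DAG: waits for a task in the upstream DAG before running its own work
with DAG(
    dag_id="holiday_report",              # hypothetical downstream DAG
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    wait_for_ingest = ExternalTaskSensor(
        task_id="wait_for_holiday_ingest",
        external_dag_id="holiday_ingest",  # the upstream DAG we depend on
        external_task_id="load_holidays",  # hypothetical final task of that DAG
        poke_interval=60,                  # check every 60 seconds
        timeout=60 * 60,                   # give up after one hour
        mode="reschedule",                 # free the worker slot between checks
    )

    build_report = BashOperator(
        task_id="build_report",
        bash_command="echo 'upstream ingest finished, building report'",
    )

    wait_for_ingest >> build_report

Note that, by default, ExternalTaskSensor looks for a run of the external DAG with the same logical date, so the two DAGs should share a schedule; if they do not, the execution_delta or execution_date_fn arguments can be used to offset the lookup.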
Getting ready
Please refer to the Getting ready section of the Configuring Airflow recipe, since this recipe uses the same technology and setup.
This recipe also depends on the holiday_ingest DAG created in the Creating custom operators recipe, so make sure you have it in place.
To avoid redundancy and repetition in this exercise, we will not show the imports and main DAG configuration again. The...