Exercise – deploying Cloud Composer jobs using Cloud Build
In this section, we will continue working with Cloud Build pipelines. This time, I will help you see how this practice can be applied to data engineering. To do that, we will create a CI/CD pipeline for deploying a Cloud Composer DAG.
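To give you a rough picture of where we are heading, the following is a minimal sketch of what a Cloud Build configuration for deploying a DAG could look like. Note that the DAG filename and the Composer bucket name below are placeholders used for illustration only; we will build the real configuration step by step later in this exercise.

steps:
  # Copy the DAG file from the repository to the Composer environment's DAG folder.
  # Replace the bucket name with your own environment's bucket (shown in the
  # Composer environment details in the GCP console).
  - name: 'gcr.io/cloud-builders/gsutil'
    args: ['cp', 'dags/example_dag.py', 'gs://your-composer-bucket/dags/']

The idea is simple: whenever the pipeline runs, the DAG file in the repository is copied to the GCS bucket that Cloud Composer reads DAGs from, so the deployment happens automatically instead of manually.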
In this exercise, we will use the DAG from Chapter 4, Building Orchestration for Batch Data Loading Using Cloud Composer. Let's briefly refresh our memory of the exercises in that chapter.
In Chapter 4, Building Orchestration for Batch Data Loading Using Cloud Composer, we learned how Cloud Composer works. We learned that in Cloud Composer, you can develop DAGs to create data pipelines. These pipelines use Airflow operators to interact with BigQuery, Cloud SQL, and GCS, or to run simple bash scripts. In those exercises, we practiced five levels of DAGs, where the level-one DAG was the simplest and the level-five DAG the most complex. To deploy a DAG...