Building data pipelines in Apache Airflow
In the previous chapter, you built your first Airflow data pipeline using Bash and Python operators. This time, you will combine two Python operators to extract data from PostgreSQL, save it as a CSV file, and then read that file and write its contents to an Elasticsearch index. The complete pipeline is shown in the following screenshot:
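As a rough sketch, a pipeline like this can be wired together with two PythonOperator tasks. The function names, connection details, table, index, and file path below are assumptions for illustration, not the chapter's exact code:

```python
import datetime as dt

import pandas as pd
import psycopg2
from elasticsearch import Elasticsearch

from airflow import DAG
from airflow.operators.python import PythonOperator


def query_postgresql():
    # Extract rows from PostgreSQL and save them to a CSV file.
    # Connection parameters and table name are placeholders.
    conn = psycopg2.connect(
        host="localhost", database="dataengineering",
        user="postgres", password="postgres",
    )
    df = pd.read_sql("SELECT * FROM users", conn)
    df.to_csv("/tmp/postgres_users.csv", index=False)


def insert_elasticsearch():
    # Read the CSV file and index each record into Elasticsearch.
    es = Elasticsearch()  # assumes a local cluster on the default port
    df = pd.read_csv("/tmp/postgres_users.csv")
    for _, row in df.iterrows():
        es.index(index="users", body=row.to_dict())


with DAG(
    dag_id="postgres_to_elasticsearch",
    start_date=dt.datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(
        task_id="query_postgresql",
        python_callable=query_postgresql,
    )
    load_task = PythonOperator(
        task_id="insert_elasticsearch",
        python_callable=insert_elasticsearch,
    )

    # Two atomic tasks: the extract must succeed before the load runs.
    extract_task >> load_task
```

Keeping the extract and the load as separate tasks means each one shows up on its own in the Airflow UI, which matters for the point about atomicity discussed next.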
The preceding Directed Acyclic Graph (DAG) looks very simple; it consists of only two tasks, and you could combine them into a single function. This is not a good idea. In Section 2, Deploying Pipelines into Production, you will learn about modifying your data pipelines for production. A key tenet of production pipelines is that each task should be atomic; that is, each task should be able to stand on its own. If a single function both queried the database and inserted the results, a failure would force you to track down whether the query or the insert was at fault. As...