Creating connectors in Airflow
DAGs and operators are of little use without a connection to an external source. Of course, there are many ways to ingest files, even from other DAGs or task results; still, the first step of a data pipeline usually involves ingesting data from external sources such as APIs or databases.
In this recipe, we will learn how to create a connector in Airflow that connects to a sample database.
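In Airflow, a connector is stored as a Connection object: a connection ID plus host, port, credentials, and other details. Before the step-by-step walkthrough, here is a minimal sketch of registering a connection programmatically instead; the connection ID, the mongo connection type (which assumes the apache-airflow-providers-mongo package is installed), and the localhost address are illustrative assumptions, not values required by this recipe.

```python
# A minimal sketch of creating an Airflow connection programmatically.
# Assumptions (not part of this recipe): the connection ID "mongo_local_test",
# the "mongo" connection type (requires the apache-airflow-providers-mongo
# package), and a MongoDB instance on localhost:27017.
from airflow import settings
from airflow.models import Connection

conn = Connection(
    conn_id="mongo_local_test",   # hypothetical connection ID
    conn_type="mongo",            # connection type supplied by the MongoDB provider
    host="localhost",
    port=27017,
)

session = settings.Session()
# Persist the connection in Airflow's metadata database
session.add(conn)
session.commit()
session.close()
```

Whichever way a connection is created, it becomes available to operators and hooks through its connection ID, which is the approach we will rely on throughout this recipe.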
Getting ready
Refer to the Getting ready section of the Configuring Airflow recipe, since this recipe uses the same setup.
This exercise also requires the local MongoDB database to be up and running. Ensure you have configured it as described in Chapter 1 and that it contains at least one database and collection. You can follow the instructions from the Connecting to a NoSQL database (MongoDB) recipe in Chapter 5.
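If your MongoDB instance does not yet contain a database and collection, the following is a minimal sketch of creating them with pymongo; the database name, collection name, and connection string are illustrative assumptions, so adjust them to match your Chapter 1 setup.

```python
# A minimal sketch, assuming MongoDB is reachable at the default local address.
# The database and collection names below are hypothetical examples.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017/")

# MongoDB creates the database and collection lazily, on the first insert
db = client["airflow_recipes"]          # hypothetical database name
collection = db["sample_collection"]    # hypothetical collection name
collection.insert_one({"status": "ok", "source": "setup-check"})

# Confirm the collection now exists and holds at least one document
print(db.list_collection_names())
print(collection.count_documents({}))
```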
How to do it…
Here are the steps to perform this recipe:
- Let’s start by opening...