Storing log files in a remote location
By default, Airflow stores and organizes its logs in a local folder, giving developers easy access and facilitating debugging when something does not go as expected. However, in larger projects or teams, giving everyone access to an Airflow instance or server is almost impractical. Fortunately, besides viewing the DAG console output, there are other ways to provide access to the logging folder without granting access to Airflow's server.
One of the most straightforward solutions is to export logs to external storage, such as Amazon S3 or Google Cloud Storage. The good news is that Airflow already has native support for exporting logs to cloud storage services.
In this recipe, we will add settings to our airflow.cfg file that enable the remote logging feature, and then test it using an example DAG.
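As a preview of the kind of settings involved, a minimal sketch of the relevant airflow.cfg entries might look like the following. The bucket name and connection ID are placeholders, and the section name may vary by version (Airflow 2.x places these keys under [logging]):

```ini
[logging]
# Ship task logs to remote storage instead of keeping them only locally
remote_logging = True

# Destination folder in the bucket (placeholder bucket name)
remote_base_log_folder = s3://my-airflow-logs-bucket/logs

# Airflow connection holding the cloud credentials (placeholder ID)
remote_log_conn_id = my_aws_conn
```

After changing these values, the Airflow scheduler and webserver need to be restarted for the configuration to take effect.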
Getting ready
Refer to the Technical requirements section for this recipe.
AWS S3
To complete this exercise,...