Summary
In this chapter, we looked at a critical part of a data engineer's job: designing and orchestrating data pipelines. First, we examined some of the core concepts around data pipelines, such as scheduled and event-based pipelines, and how to handle failures and retries.
We then looked at four different AWS services that can be used for creating and orchestrating data pipelines: AWS Data Pipeline, AWS Glue Workflows, Amazon Managed Workflows for Apache Airflow (MWAA), and AWS Step Functions. We discussed some of the use cases for each of these services, as well as the pros and cons of each.
Then, in the hands-on section of this chapter, we built an event-driven pipeline. We used two AWS Lambda functions for processing and an Amazon SNS topic for sending out failure notifications. We then put these pieces of our data pipeline together into a state machine orchestrated by AWS Step Functions, and looked at how to handle errors within the state machine.
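To recap the shape of what we built, here is a minimal sketch of an Amazon States Language (ASL) definition for a pipeline like the one above: two Lambda task states, with a `Retry` policy and `Catch` rules that route any failure to an SNS publish state. The ARNs, function names, and topic name are placeholders for illustration, not the resources created in the hands-on exercise.

```python
import json

# Sketch of an ASL state machine: two Lambda tasks with error handling.
# All ARNs below are hypothetical placeholders.
state_machine = {
    "Comment": "Event-driven pipeline with failure notifications",
    "StartAt": "ProcessData",
    "States": {
        "ProcessData": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:process-data",
            # Retry transient failures before giving up
            "Retry": [
                {"ErrorEquals": ["States.TaskFailed"],
                 "IntervalSeconds": 5, "MaxAttempts": 2, "BackoffRate": 2.0}
            ],
            # On any unrecovered error, route to the notification state
            "Catch": [{"ErrorEquals": ["States.ALL"], "Next": "NotifyFailure"}],
            "Next": "LoadData",
        },
        "LoadData": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:load-data",
            "Catch": [{"ErrorEquals": ["States.ALL"], "Next": "NotifyFailure"}],
            "End": True,
        },
        "NotifyFailure": {
            "Type": "Task",
            # Step Functions service integration for publishing to SNS
            "Resource": "arn:aws:states:::sns:publish",
            "Parameters": {
                "TopicArn": "arn:aws:sns:us-east-1:123456789012:pipeline-failures",
                "Message.$": "$.Cause",
            },
            "Next": "PipelineFailed",
        },
        "PipelineFailed": {"Type": "Fail"},
    },
}

# The JSON document you would pass as the state machine definition
definition_json = json.dumps(state_machine, indent=2)
print(definition_json.splitlines()[0])  # → {
```

The key idea is that error handling lives in the state machine itself rather than inside each Lambda function: a `Catch` on every task state funnels failures to a single notification path.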
So far, we have...