Streams and TTL
A popular architectural pattern in modern applications is event-driven downstream processing: performing work in reaction to something that happened upstream. Work is done only when needed, as opposed to running “always on” systems that may spend a considerable amount of time idle. Event-driven architectural patterns therefore align well with the pay-as-you-go model of cloud computing.
Although the crux of DynamoDB usage is storing and retrieving data efficiently, real-life use cases often require more than that. An application backed by DynamoDB may be highly performant for OLTP operations, but there is often a need to perform additional actions on the data: tracking and auditing database activity, archiving data to a data lake, feeding data into an analytics tool for downstream processing, or triggering downstream actions based on the changed state of an item.
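To make this concrete, a common way to react to item-level changes is an AWS Lambda function subscribed to a table's stream, which receives batches of change records. The sketch below is illustrative, not a prescribed implementation: the handler name and the per-event actions are placeholders, and the old/new item images appear in a record only when the table's stream view type captures them.

```python
import json


def handler(event, context):
    """Illustrative Lambda handler for DynamoDB stream events (names are placeholders)."""
    for record in event["Records"]:
        event_name = record["eventName"]       # "INSERT" | "MODIFY" | "REMOVE"
        keys = record["dynamodb"]["Keys"]      # primary key of the changed item

        if event_name == "INSERT":
            # e.g. forward the new item to an analytics pipeline
            print("New item:", json.dumps(keys))
        elif event_name == "MODIFY":
            # e.g. write an audit entry comparing the before/after images
            # (populated only if the stream view type includes item images)
            old_image = record["dynamodb"].get("OldImage", {})
            new_image = record["dynamodb"].get("NewImage", {})
            changed_attrs = sorted(set(new_image) ^ set(old_image))
            print("Changed item:", json.dumps(keys), "attrs added/removed:", changed_attrs)
        elif event_name == "REMOVE":
            # e.g. archive the deleted item; items expired by TTL also
            # arrive on the stream as REMOVE records
            print("Removed item:", json.dumps(keys))
```

Wiring this up amounts to enabling a stream on the table and creating an event source mapping to the function; the stream view type controls whether old and new item images are included in each record.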