Serverless Queues
One of the most useful components in scaling a serverless architecture is the queue. Queues are a key asynchronous processing concept: data points are added to the back of the queue and taken from the front, which makes a queue a first-in, first-out (FIFO) data structure. In a cloud architecture, a queue will generally store events or messages from an upstream process until downstream subscribers have time to process them.
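The FIFO behavior described above can be sketched with Python's standard library; `collections.deque` stands in here for a managed cloud queue, and the event fields are illustrative:

```python
from collections import deque

# A queue: items are appended at the back and removed from the front (FIFO).
events = deque()

# An upstream process enqueues events...
events.append({"id": 1, "type": "order_created"})
events.append({"id": 2, "type": "order_created"})

# ...and a downstream subscriber dequeues them in arrival order.
first = events.popleft()
print(first["id"])  # → 1
```

A managed queue service adds durability, retries, and delivery guarantees on top of this basic ordering, but the enqueue-at-back, dequeue-at-front contract is the same.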
An important thing to understand before deciding to use a queue is the compromise you make in using one. In a synchronous operation (for example, inserting a record into a database in a single step), if something goes wrong, your user gets instant feedback. In an asynchronous operation (for example, dropping an event onto a queue that a worker later picks up and inserts into the same database), if a later step goes wrong, the user has usually already left the application and cannot fix their mistake or call support.
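This trade-off can be illustrated in-process; the `insert_record` validation below is a hypothetical stand-in for a real database write, and `queue.Queue` stands in for a managed queue:

```python
import queue

def insert_record(record):
    # Hypothetical validation standing in for a real database insert.
    if "user_id" not in record:
        raise ValueError("missing user_id")

# Synchronous path: the failure surfaces immediately,
# while the user is still present to correct it.
try:
    insert_record({"amount": 10})
except ValueError as exc:
    print(f"shown to the user right away: {exc}")

# Asynchronous path: the same bad record is happily accepted onto
# the queue, so the user sees no error at all...
work = queue.Queue()
work.put({"amount": 10})

# ...and the failure only appears later, inside the worker,
# after the user has moved on.
record = work.get()
try:
    insert_record(record)
except ValueError as exc:
    print(f"failed after the fact: {exc}")  # the user never sees this
```

This is why queue-based designs usually need their own failure handling, such as retries and dead-letter queues, rather than relying on the user to react to an error.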
A good way to try...