Bulk data delivery
Delivering data in bulk used to be the job of a File Transfer Protocol (FTP), Secure File Transfer Protocol (SFTP), or File Transfer Protocol Secure (FTPS) outbound service that you crafted yourself. For some older systems, this is still the case. Legacy data consumers can't deal with cloud technology and the variations involved in reading from Azure Blob Storage or AWS S3 buckets. For these consumers, you need an FTP facility that either pushes datasets out over FTP or posts files for pickup. For both flavors (push or pull), you will need to design a notification capability so that the consumer is made aware of a delivery or told that one is ready for pickup. You will also need to account for communication errors and retransmissions, with the risk that, in the worst case, an SLA is exceeded. The FTP facility works like a workflow manager, which means it requires constant monitoring and the ability to spin up issue-remediation tasks automatically. You'll also need to create...
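To make the push flavor concrete, here is a minimal sketch of a push-with-retry delivery loop in Python, using the standard-library ftplib module (FTP_TLS for FTPS). The host name, credentials, notify_consumer hook, SLA window, and file name are all hypothetical placeholders, not part of any real service:

```python
import time
from datetime import datetime, timedelta, timezone
from ftplib import FTP_TLS, all_errors
from pathlib import Path

# Hypothetical endpoint and SLA window -- placeholders for illustration only.
HOST = "ftps.consumer.example.com"
USER = "outbound"
PASSWORD = "secret"  # in practice, pull this from a secrets store
SLA_DEADLINE = datetime.now(timezone.utc) + timedelta(hours=4)
MAX_ATTEMPTS = 5


def notify_consumer(message: str) -> None:
    """Hypothetical notification hook (e.g., email, queue message, webhook)."""
    print(f"[notify] {message}")


def push_dataset(local_file: Path) -> None:
    """Push one file over FTPS, retrying with backoff until the SLA expires."""
    for attempt in range(1, MAX_ATTEMPTS + 1):
        try:
            with FTP_TLS(HOST) as ftps:
                ftps.login(USER, PASSWORD)
                ftps.prot_p()  # encrypt the data channel, not just the login
                with local_file.open("rb") as f:
                    ftps.storbinary(f"STOR {local_file.name}", f)
            # Delivery notification: tell the consumer the file has arrived.
            notify_consumer(f"{local_file.name} delivered on attempt {attempt}")
            return
        except all_errors as exc:
            # Communication error: back off and retransmit, unless another
            # retry would land past the SLA deadline or exhaust our attempts.
            if datetime.now(timezone.utc) >= SLA_DEADLINE or attempt == MAX_ATTEMPTS:
                notify_consumer(f"{local_file.name} FAILED, SLA at risk: {exc}")
                raise
            time.sleep(2 ** attempt)  # exponential backoff before retrying


if __name__ == "__main__":
    push_dataset(Path("daily_extract.csv"))
```

A pull-style variant would upload to a pickup directory and send a "ready for pickup" notification instead; the monitoring, retry, and SLA-tracking concerns are the same in both flavors.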