Chapter 6. Going with the Flow with Akka Streams
Data processing, that is, taking data from point A to point B with all kinds of transforms in between, is nothing new in the world of programming. It's the basis of the age-old processes of Extract, Transform, Load (ETL) and, to a lesser extent, Electronic Data Interchange (EDI). For decades now, programmers have had to produce, transform, and consume data as it moves between systems. So, if this isn't a new problem, why would anyone need a new paradigm to handle it? What's changed in this day and age that forces us to rethink data processing?
The biggest change is the sheer volume of data that's now available to process. We're living in the age of big data, and there are massive data sets that people want to consume and make use of within their apps, one such example being the Twitter Firehose. These data sets are very large, way too big to try and...