Working in IoT generally means working with large amounts of data. Sensors can be verbose, and device fleets can be large as well. CERN's particle accelerator, for example, generates over a petabyte of data per second; sending that raw data to a central repository would be impractical. Companies dealing with extremely large or extremely fast-moving datasets face similar challenges in moving and processing their data.
In this recipe, we are going to distribute a workload across several systems, allowing one system to capture an image and another to process it. In our example, a small device could capture the image and stream it to an industrial PC or a set of servers in a factory. We are going to use Docker and docker-compose for orchestration, while for our algorithm, we are going to use YOLO (an object detection algorithm) through its OpenCV implementation.
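As a preview of the processing side, here is a minimal sketch of running YOLO through OpenCV's `cv2.dnn` module. The model and label file names (`yolov3.cfg`, `yolov3.weights`, `coco.names`), the input size, and the thresholds are assumptions for illustration; substitute whatever model files your deployment actually ships.

```python
# A minimal sketch of YOLO inference via OpenCV's DNN module.
# File names and thresholds below are illustrative assumptions.
import cv2
import numpy as np

CONF_THRESHOLD = 0.5   # minimum class confidence to keep a detection
NMS_THRESHOLD = 0.4    # non-maximum suppression overlap threshold

# Load the Darknet model definition and pre-trained weights.
net = cv2.dnn.readNetFromDarknet("yolov3.cfg", "yolov3.weights")
with open("coco.names") as f:
    classes = [line.strip() for line in f]

# Resolve the names of the YOLO output layers.
layer_names = net.getLayerNames()
out_layers = [layer_names[i - 1]
              for i in net.getUnconnectedOutLayers().flatten()]

def detect(image):
    """Run one forward pass; return (class_name, confidence, box) tuples."""
    h, w = image.shape[:2]
    # YOLO expects a normalized, resized, RGB blob.
    blob = cv2.dnn.blobFromImage(image, 1 / 255.0, (416, 416),
                                 swapRB=True, crop=False)
    net.setInput(blob)
    outputs = net.forward(out_layers)

    boxes, confidences, class_ids = [], [], []
    for output in outputs:
        for detection in output:
            scores = detection[5:]
            class_id = int(np.argmax(scores))
            confidence = float(scores[class_id])
            if confidence > CONF_THRESHOLD:
                # Box coordinates are relative to the input image size.
                cx, cy, bw, bh = detection[:4] * np.array([w, h, w, h])
                boxes.append([int(cx - bw / 2), int(cy - bh / 2),
                              int(bw), int(bh)])
                confidences.append(confidence)
                class_ids.append(class_id)

    # Suppress overlapping boxes that describe the same object.
    keep = cv2.dnn.NMSBoxes(boxes, confidences,
                            CONF_THRESHOLD, NMS_THRESHOLD)
    return [(classes[class_ids[i]], confidences[i], boxes[i])
            for i in np.array(keep).flatten()]

if __name__ == "__main__":
    frame = cv2.imread("sample.jpg")  # stand-in for a streamed frame
    for name, conf, box in detect(frame):
        print(name, round(conf, 2), box)
```

In the distributed setup this recipe builds, a capture container on the small device would stream frames over the network, and a container on the industrial PC or server would feed each received frame to a function like `detect()` above.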