Large-scale image processing using Hadoop
We have already mentioned in earlier chapters how the size and volume of images are increasing day by day; storing and processing this vast amount of image data is difficult for centralized computers. Let's consider an example to get a practical idea of such situations. Take a large-scale image of 81025 pixels by 86273 pixels. Each pixel is composed of three values: red, green, and blue. Assume that storing each of these values requires a 32-bit precision floating-point number. The total memory consumption of that image can then be calculated as follows:
86273 * 81025 * 3 * 32 bits = 671,065,903,200 bits ≈ 83.88 billion bytes ≈ 78.12 GB
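The same estimate can be reproduced with a few lines of Java. This is only a minimal sketch of the arithmetic above; the class and variable names are illustrative and do not come from any Hadoop API:

public class ImageMemoryEstimate {
    public static void main(String[] args) {
        long widthPixels   = 86273L;  // image width in pixels
        long heightPixels  = 81025L;  // image height in pixels
        long channels      = 3L;      // red, green, and blue values per pixel
        long bytesPerValue = 4L;      // a 32-bit float occupies 4 bytes

        long totalBytes = widthPixels * heightPixels * channels * bytesPerValue;
        double gigabytes = totalBytes / (1024.0 * 1024.0 * 1024.0);

        System.out.printf("Total: %,d bytes (about %.2f GB)%n", totalBytes, gigabytes);
        // Prints roughly 83,883,237,900 bytes, which is about 78.12 GB
    }
}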
Leaving aside any post-processing of this image, it is clear that a traditional computer cannot even hold this amount of data in its main memory. Even though some advanced computers come with higher configurations, given the return on investment, most companies do not opt for such machines...