Understanding histograms
As was mentioned in the introductory part of this chapter, there are a few concepts in computer vision that are especially important when dealing with video processing and the algorithms we'll cover later in this chapter. One of those concepts is the histogram. Since understanding histograms is essential to understanding most video analysis topics, we'll go through quite a bit of information about them in this section before moving on to the next topics.

A histogram is often described as a way of representing the distribution of data. That is a simple and complete definition, but let's also describe what it means in terms of computer vision. In computer vision, a histogram is a graphical representation of the distribution of pixel values in an image. For example, in a grayscale image, a histogram is a graph representing the number of pixels that contain each possible grayscale intensity (a value between 0 and 255). In an RGB color image, a histogram can be calculated for each of the three channels separately, representing the distribution of red, green, and blue intensities.
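To make the grayscale case concrete, here is a minimal sketch that counts how many pixels fall into each of the 256 possible intensities. It assumes OpenCV's Python bindings and a placeholder file name (image.jpg); the specific function used, cv2.calcHist, is one common way to do this and is not something prescribed by this section.

```python
# Minimal sketch: a 256-bin histogram of a grayscale image (assumes OpenCV's
# Python bindings are installed; "image.jpg" is a placeholder file name)
import cv2

# Load the image as grayscale, so every pixel is a single intensity in 0..255
gray = cv2.imread("image.jpg", cv2.IMREAD_GRAYSCALE)

# calcHist takes a list of images, the channel index, an optional mask,
# the number of bins, and the range of values to cover
hist = cv2.calcHist([gray], [0], None, [256], [0, 256])

# hist[i] now holds the number of pixels whose intensity equals i
print("Pixels with intensity 128:", int(hist[128][0]))
```

In practice the 256 counts are usually drawn as a bar chart or curve, which is the graphical representation described above; the number of bins can also be reduced (for example, to 32) to get a coarser view of the distribution.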