Image similarity search
With the emergence of the Internet of Things and autonomous robots, the ability to compare an image captured from the environment against others in a database is fundamental for understanding that image's context. Imagine an autonomous car or a drone with a camera, looking for something like a red bag or an advertisement. This kind of task requires searching for an image without any associated metadata. Because of this, you don't look for equality; instead, you look for similarity.
When comparing two or more images, the first question that comes to mind is: what makes one image similar to another? We can say that one image is equal to another if all their pixels match. However, a small change in the lighting, angle, or rotation of the camera produces a large change in the numerical values of the pixels. Finding ways to decide whether two images are similar is the main concern of services such as Google Search by Image or TinEye, where the user uploads an image instead of providing keywords...
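To see why strict pixel equality is too brittle, here is a minimal sketch in Python with NumPy. It is only illustrative: the array size, the one-unit brightness shift, and the 5% similarity threshold are assumptions chosen for the example, not values from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# A small grayscale "image" with pixel values in [0, 255].
original = rng.integers(0, 256, size=(8, 8)).astype(np.float64)

# The same scene, one unit brighter -- e.g. a slight change in lighting.
brighter = np.clip(original + 1, 0, 255)

# Strict equality: fails even though the two images look the same.
print(np.array_equal(original, brighter))   # False

# Mean absolute pixel difference: a crude similarity score instead.
mad = np.mean(np.abs(original - brighter))
print(mad)                                  # ~1.0 (out of 255)

# Treat the images as "similar" when the average difference is small.
print(mad / 255.0 < 0.05)                   # True
```

The exact-match test breaks under a change a human would never notice, while an aggregate measure of pixel difference still reports the two images as nearly identical; this is the gap that similarity search is designed to bridge.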