image comparison

    Date: 06/24/05 (Algorithms)    Keywords: no keywords

    I need to get a quantitative indication of how similar two given images are to each other.

    The existing algorithm I'm working with takes the sum of squared differences for each pixel. But this doesn't work too well when the proportion of whitespace is high -- for instance, if we have just a thin vertical line on a large white background, it would give a good correlation score with the 90-degree rotation of itself, since most of the pixels would agree after the rotation. But to a human, the two images would register a low correlation.
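    The existing approach might be sketched like this (a minimal illustration, assuming grayscale images as NumPy arrays; the function name `ssd` and the line-image example are my own, not from the post):

```python
import numpy as np

def ssd(a, b):
    """Sum of squared pixel differences; lower means more similar."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(np.sum((a - b) ** 2))

# The failure case described above: a thin vertical line on a large
# white background scores well against its own 90-degree rotation,
# because the vast majority of (white) pixels still agree.
img = np.full((100, 100), 255.0)   # white background
img[:, 50] = 0.0                   # thin vertical black line
rotated = np.rot90(img)            # now a horizontal line

print(ssd(img, rotated))           # small relative to the image area
```

    Here only the line pixels disagree -- 198 of 10,000 pixels -- so the score looks deceptively good even though the two images are visually quite different.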

    This is because, presumably, we filter out the background when we make a mental comparison. So my idea is to somehow have the program separate the background hue range from the foreground, so it can ignore a large, near-monochrome background. So my first question is: does this intuition make sense? If not, any ideas on how to better go about it?
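    One way that idea could be sketched, assuming the background is simply the single most common pixel value (the helper names, the tolerance parameter, and the normalization are illustrative assumptions, not the poster's method):

```python
import numpy as np

def foreground_mask(img, tolerance=10):
    """Treat the most frequent pixel value as background; everything
    further than `tolerance` from it counts as foreground."""
    values, counts = np.unique(img, return_counts=True)
    background = values[np.argmax(counts)]
    return np.abs(np.asarray(img, dtype=float) - background) > tolerance

def masked_ssd(a, b, tolerance=10):
    """Squared-difference score restricted to pixels that are foreground
    in either image, normalized by the number of pixels compared."""
    mask = foreground_mask(a, tolerance) | foreground_mask(b, tolerance)
    n = int(mask.sum())
    if n == 0:
        return 0.0
    d = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    return float(np.sum(d[mask] ** 2) / n)
```

    On the thin-line example, nearly every pixel that survives the mask disagrees after the rotation, so the per-pixel score is now close to the worst case instead of near zero. For photographs a hue histogram in HSV space would likely replace the exact-value count, since the background won't be a single value.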

    Edit: Sorry I wasn't more specific. What I'm looking at is the correlation of an image with itself under rotation or reflection. So, mathematically, I'm testing for symmetries, where a good correlation denotes the existence of a symmetry under that transformation. And the images in question do often tend to be photographs, so edge identification is a little more difficult.
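    The symmetry test described in the edit could be sketched as comparing the image against each transform of itself (a hypothetical scoring scheme, assuming a square grayscale array; scores near zero suggest the corresponding symmetry):

```python
import numpy as np

def symmetry_scores(img):
    """Mean squared difference between a square image and each rigid
    transform of itself; lower score = stronger evidence of symmetry."""
    img = np.asarray(img, dtype=float)
    transforms = {
        "rot90":  np.rot90(img),
        "rot180": np.rot90(img, 2),
        "flip_h": np.fliplr(img),   # reflection about the vertical axis
        "flip_v": np.flipud(img),   # reflection about the horizontal axis
    }
    return {name: float(np.mean((img - t) ** 2))
            for name, t in transforms.items()}

# A centered plus sign is symmetric under all four transforms.
plus = np.zeros((101, 101))
plus[50, :] = 1.0
plus[:, 50] = 1.0
print(symmetry_scores(plus))
```

    For photographs the scores will never be exactly zero, so in practice one would compare them against a threshold (or rank them), ideally after the background masking discussed above.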

    Source: http://www.livejournal.com/community/algorithms/57326.html


