iOS: compare a slice of an image to a library of options

I'm basically trying to work out how to take a slice of an image, say a screenshot of an iPhone home screen, slice out the first icon, and compare it to a fixed set of images in a library. Any help on where to start?


I'm no iPhone programmer, but I might be able to suggest a few things:

  • The SURF feature detection implemented in OpenCV should help you with this
  • There is a nice article on using OpenCV in Objective-C code.

A quick & dirty way might be to use the difference blend mode, which returns the per-pixel difference between the first image (top) and the second image (bottom). If there is no difference, the result is completely black. So the more black pixels in the difference result, the more similar, potentially, the compared images are.

I'm not an iOS developer, so I don't know if there is an image library that ships with the SDK, or if there's a free/open-source library for basic image processing. Still, this should be trivial to implement:

e.g.

    - (int)differenceBetweenPixel:(int)topPixel andPixel:(int)bottomPixel
    {
        return abs(topPixel - bottomPixel);
    }

HTH


This may not help you with taking a screenshot of the iOS home screen... But these articles show how to take snapshots from within a UIKit application:

https://developer.apple.com/library/prerelease/ios/#qa/qa1703/_index.html

https://developer.apple.com/library/prerelease/ios/#qa/qa1714/_index.html

Alternatively, you could instruct the user to press the Home and Power buttons together to save a screenshot to the Camera Roll, then load that screenshot into your app for processing.

Hope this helps!

