
Crop an image from a certain portion of the screen on iPhone programmatically

NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
CGSize contextSize = CGSizeMake(320, 400);

UIGraphicsBeginImageContext(contextSize);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *savedImg = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
[self setSaveImage:savedImg];

I use this to extract some part of the image from the main screen.

In UIGraphicsBeginImageContext I can only pass a size. Is there any way to use a CGRect, or some other way, to extract an image from a specific portion of the screen, i.e. (x, y, 320, 400), something like this?
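One way to capture a specific rect rather than the whole bounds is to translate the graphics context so the region you want starts at the origin before rendering the layer. A minimal sketch in current Swift (the function name and the example rect are only illustrative):

import UIKit

// Render only a sub-rect of a view by shifting the context so that
// rect.origin maps to (0, 0); everything else falls outside the canvas.
func snapshot(of view: UIView, croppedTo rect: CGRect) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: rect.size)
    return renderer.image { context in
        context.cgContext.translateBy(x: -rect.origin.x, y: -rect.origin.y)
        view.layer.render(in: context.cgContext)
    }
}

// Usage: capture the region (0, 60, 320, 400) of the current view.
// let cropped = snapshot(of: self.view, croppedTo: CGRect(x: 0, y: 60, width: 320, height: 400))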


Hope this helps:

// Create a new image context (retina safe); `size` is the size of the
// region you want to keep
UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);

// Create the rect to draw the existing image into; offsetting by the
// negative of the crop origin (x, y) puts the wanted region at (0, 0)
// and pushes everything else outside the context
CGRect rect = CGRectMake(-x, -y, existingImage.size.width, existingImage.size.height);

// Draw the image into the rect
[existingImage drawInRect:rect];

// Save the cropped image, then end the image context
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
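The same draw-at-a-negative-offset idea in Swift, as a rough sketch (the function name is illustrative):

import UIKit

// Crop an existing UIImage by drawing it offset so that only the
// desired region falls inside the image context.
func crop(_ image: UIImage, to cropRect: CGRect) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: cropRect.size)
    return renderer.image { _ in
        image.draw(at: CGPoint(x: -cropRect.origin.x, y: -cropRect.origin.y))
    }
}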


This question is really a duplicate of several other questions, including this one: How to crop the UIImage?, but since it took me a while to find a solution, I will cross-post it again.

In my quest for a solution that I could more easily understand (and that was written in Swift), I arrived at the following.


I wanted to be able to crop from a region based on an aspect ratio, and scale to a size based on an outer bounding extent. Here is my variation:

import AVFoundation
import UIKit

class Image {

    class func crop(image: UIImage, crop source: CGRect, aspect: CGSize, outputExtent: CGSize) -> UIImage {

        // Fit the requested aspect ratio inside both the source crop rect
        // and the output extent.
        let sourceRect = AVMakeRect(aspectRatio: aspect, insideRect: source)
        let targetRect = AVMakeRect(aspectRatio: aspect, insideRect: CGRect(origin: .zero, size: outputExtent))

        let opaque = true
        let deviceScale: CGFloat = 0.0 // use the scale of the device's main screen
        UIGraphicsBeginImageContextWithOptions(targetRect.size, opaque, deviceScale)

        // Scale factor that maps the source crop onto the output rect.
        let scale = max(
            targetRect.size.width / sourceRect.size.width,
            targetRect.size.height / sourceRect.size.height)

        // Offset by the scaled crop origin and draw the whole image scaled;
        // everything outside the crop falls outside the context.
        let drawRect = CGRect(
            x: -sourceRect.origin.x * scale,
            y: -sourceRect.origin.y * scale,
            width: image.size.width * scale,
            height: image.size.height * scale)
        image.draw(in: drawRect)

        let scaledImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()

        return scaledImage ?? image
    }
}

There are a couple of things that I found confusing: the separate concerns of cropping and resizing. Cropping is handled by the origin of the rect that you pass to draw(in:), and scaling is handled by its size. In my case, I needed to relate the size of the cropping rect on the source to my output rect of the same aspect ratio. The scale factor is then output / input, and this needs to be applied to the drawRect (passed to draw(in:)).
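For example, a hypothetical call (where `photo` is assumed to be some UIImage you already have, and the numbers are only illustrative): cropping a 1000x1000 region and fitting it to a 320x320 output extent gives scale = 320 / 1000 = 0.32, which is what gets applied to the draw rect.

// Crop a 1000x1000 square starting at (500, 300) from `photo` and
// scale it into a 320x320 output extent; here scale = 0.32.
let thumbnail = Image.crop(
    image: photo,
    crop: CGRect(x: 500, y: 300, width: 1000, height: 1000),
    aspect: CGSize(width: 1, height: 1),
    outputExtent: CGSize(width: 320, height: 320))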

One caveat is that this approach effectively assumes that the image you are drawing is larger than the image context. I have not tested this, but I think you can use this code to handle cropping / zooming by explicitly setting the context's scale parameter to the scale factor computed above. By default, UIKit applies a multiplier based on the screen resolution.

Finally, it should be noted that this UIKit approach is higher level than the Core Graphics / Quartz and Core Image approaches, and it seems to handle image orientation issues. It is also worth mentioning that it is pretty fast, second only to ImageIO, according to this post: http://nshipster.com/image-resizing/
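For comparison, the lower-level Core Graphics route crops the backing CGImage directly. A rough sketch (note that, unlike the UIKit drawing approach above, this does not compensate for image orientation, and the crop rect must be converted from points to pixels):

import UIKit

// Crop at the CGImage level; the crop rect is given in points and
// converted to pixels using the image's scale.
func cgCrop(_ image: UIImage, to rect: CGRect) -> UIImage? {
    let pixelRect = CGRect(
        x: rect.origin.x * image.scale,
        y: rect.origin.y * image.scale,
        width: rect.size.width * image.scale,
        height: rect.size.height * image.scale)
    guard let croppedCGImage = image.cgImage?.cropping(to: pixelRect) else { return nil }
    return UIImage(cgImage: croppedCGImage, scale: image.scale, orientation: image.imageOrientation)
}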
