
Crop circular or elliptical image from original UIImage

I am working with OpenCV for face detection, and I want the face to be cropped once it is detected. So far I have detected the face and drawn a rect/ellipse around it on the iPhone.

Please help me crop the face in a circular/elliptical shape.

- (UIImage *)opencvFaceDetect:(UIImage *)originalImage
{

cvSetErrMode(CV_ErrModeParent);

IplImage *image = [self CreateIplImageFromUIImage:originalImage];

// Scaling down

/*
Creates IPL image (header and data) ----------------cvCreateImage
CVAPI(IplImage*)  cvCreateImage( CvSize size, int depth, int channels );
*/

IplImage *small_image = cvCreateImage(cvSize(image->width/2,image->height/2),
    IPL_DEPTH_8U, 3);

/* Downsample with Gaussian smoothing -------- cvPyrDown */
cvPyrDown(image, small_image, CV_GAUSSIAN_5x5);
int scale = 2;

// Load XML
NSString *path = [[NSBundle mainBundle] pathForResource:@"haarcascade_frontalface_default" ofType:@"xml"];
CvHaarClassifierCascade* cascade = (CvHaarClassifierCascade*)cvLoad([path cStringUsingEncoding:NSASCIIStringEncoding], NULL, NULL, NULL);

// Check whether the cascade has loaded successfully; otherwise report an error and quit

if( !cascade )
{
    NSLog(@"ERROR: Could not load classifier cascade\n");
    return nil;
}

//Allocate the Memory storage
CvMemStorage* storage = cvCreateMemStorage(0);

// Clear the memory storage which was used before
cvClearMemStorage( storage );

CGColorSpaceRef colorSpace;
CGContextRef contextRef;


CGRect face_rect = CGRectZero;
// Find whether the cascade is loaded, to find the faces. If yes, then:
if( cascade )
{
CvSeq* faces = cvHaarDetectObjects(small_image, cascade, storage, 1.1f, 3, 0, cvSize(20, 20));
cvReleaseImage(&small_image);

// Create canvas to show the results
 CGImageRef imageRef = originalImage.CGImage;
 colorSpace = CGColorSpaceCreateDeviceRGB();
 contextRef = CGBitmapContextCreate(NULL, originalImage.size.width, originalImage.size.height, 8, originalImage.size.width * 4,
                                                colorSpace, kCGImageAlphaPremultipliedLast|kCGBitmapByteOrderDefault);
//VIKAS
CGContextDrawImage(contextRef, CGRectMake(0, 0, originalImage.size.width, originalImage.size.height), imageRef);



CGContextSetLineWidth(contextRef, 4);
CGContextSetRGBStrokeColor(contextRef, 1.0, 1.0, 1.0, 0.5);




// Draw the results on the image: mark each detected face with a rectangle/ellipse

// Loop the number of faces found.

for(int i = 0; i < faces->total; i++) 
    {
    NSAutoreleasePool * pool = [[NSAutoreleasePool alloc] init];

    // Calc the rect of faces
    // Create a new rectangle for drawing the face

    CvRect cvrect = *(CvRect*)cvGetSeqElem(faces, i);
    //  CGRect face_rect = CGContextConvertRectToDeviceSpace(contextRef, 
    //                          CGRectMake(cvrect.x * scale, cvrect.y * scale, cvrect.width * scale, cvrect.height * scale));


     face_rect = CGContextConvertRectToDeviceSpace(contextRef,
                     CGRectMake(cvrect.x*scale, cvrect.y, cvrect.width*scale, cvrect.height*scale*1.25));

    facedetectapp=(FaceDetectAppDelegate *)[[UIApplication sharedApplication]delegate];
    facedetectapp.grabcropcoordrect=face_rect;

    NSLog(@"  FACE off %f %f %f %f",facedetectapp.grabcropcoordrect.origin.x,facedetectapp.grabcropcoordrect.origin.y,facedetectapp.grabcropcoordrect.size.width,facedetectapp.grabcropcoordrect.size.height);
    CGContextStrokeRect(contextRef, face_rect);
        //CGContextFillEllipseInRect(contextRef,face_rect);
    CGContextStrokeEllipseInRect(contextRef,face_rect);


    [pool release];
}

}
CGImageRef imageRef = CGImageCreateWithImageInRect([originalImage CGImage],face_rect);
    UIImage *returnImage = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);


CGContextRelease(contextRef);
CGColorSpaceRelease(colorSpace);

cvReleaseMemStorage(&storage);
cvReleaseHaarClassifierCascade(&cascade);

   return returnImage;
}



Thanks, Vikas


There are a pile of blend modes to choose from, a few of which are useful for "masking". I believe this should do approximately what you want:

CGContextSaveGState(contextRef);
CGContextSetBlendMode(contextRef,kCGBlendModeDestinationIn);
CGContextFillEllipseInRect(contextRef,face_rect);
CGContextRestoreGState(contextRef);

"approximately" because it'll mask the entire context contents every time, thus doing the wrong thing for more than one face. To handle this case, use CGContextAddEllipseInRect() in the loop and CGContextFillPath() at the end.

You might also want to look at CGContextBeginTransparencyLayerWithRect().


Following is the answer I gave in How to crop UIImage on oval shape or circle shape? to make the image circular. It works for me.

Download the support archive from http://vocaro.com/trevor/blog/2009/10/12/resize-a-uiimage-the-right-way/ and import the categories:

#import "UIImage+RoundedCorner.h"
#import "UIImage+Resize.h"

The following lines resize the image and round it off with the given corner radius:

UIImage *mask = [UIImage imageNamed:@"mask.jpg"];

mask = [mask resizedImage:CGSizeMake(47, 47) interpolationQuality:kCGInterpolationHigh ];
mask = [mask roundedCornerImage:23.5 borderSize:1];
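Applied to the question's case, the same two calls could presumably be used on the rectangular face crop instead of a mask image; a rough sketch, assuming returnImage is the UIImage produced by CGImageCreateWithImageInRect in the question's method and that both categories are imported:

// Hypothetical follow-up: returnImage is the rectangular face crop returned above
UIImage *face = [returnImage resizedImage:CGSizeMake(100, 100)
                     interpolationQuality:kCGInterpolationHigh];
// A corner radius of half the side turns the square into a circle
face = [face roundedCornerImage:50 borderSize:0];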

Hope it helps someone.
