AVCaptureSession output sample buffer save to CoreData

I am using an AVCaptureSession to capture frames from the camera via setSampleBufferDelegate on AVCaptureVideoDataOutput. The delegate method is shown below; it converts each sample buffer to a UIImage and displays it in a UIImageView. I would like to save each UIImage to disk and store its URL in a new managed object, but I don't know how to get hold of a managedObjectContext safely, because the delegate callbacks arrive on a serial dispatch queue rather than on the main thread. Can anyone suggest a way to use Core Data together with dispatch queues so that I can build a collection of images stored on disk, each corresponding to a managed object?

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    /* Lock the image buffer */
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    /* Get information about the image */
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    /* Create a CGImageRef from the CVImageBufferRef */
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);

    /* Release the context and color space */
    CGContextRelease(newContext);
    CGColorSpaceRelease(colorSpace);

    /* Display the result in the image view. The orientation is changed so the video is displayed
       correctly. As with the CALayer, we are not on the main thread, so push the UI update there. */
    UIImage *image = [UIImage imageWithCGImage:newImage scale:1.0 orientation:UIImageOrientationRight];

    /* Release the CGImageRef */
    CGImageRelease(newImage);

    [self.imageView performSelectorOnMainThread:@selector(setImage:) withObject:image waitUntilDone:YES];

    /* Unlock the image buffer */
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    [pool drain];
}


The recommended solution is to create a new NSManagedObjectContext for each thread, each pointing to a single NSPersistentStoreCoordinator. You may also want to listen for NSManagedObjectContextDidSaveNotification, to merge the changes into the main thread's context (using the aptly-named mergeChangesFromContextDidSaveNotification:).
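Here is a minimal sketch of that notification/merge step, assuming self.managedObjectContext is the main thread's context; the method names registerForContextSaveNotifications and backgroundContextDidSave: are illustrative, not part of any framework:

- (void)registerForContextSaveNotifications {
    // Observe saves from any context; saves from the main context are filtered out below.
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(backgroundContextDidSave:)
                                                 name:NSManagedObjectContextDidSaveNotification
                                               object:nil];
}

- (void)backgroundContextDidSave:(NSNotification *)notification {
    // Saves made by the main-thread context itself need no merge.
    if ([notification object] == self.managedObjectContext) {
        return;
    }
    // mergeChangesFromContextDidSaveNotification: must run on the thread that owns
    // the receiving context, i.e. the main thread here.
    [self.managedObjectContext performSelectorOnMainThread:@selector(mergeChangesFromContextDidSaveNotification:)
                                                withObject:notification
                                             waitUntilDone:YES];
}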

Personally, I like to use an accessor like this in a central place to handle the per-thread contexts:

- (NSManagedObjectContext *) managedObjectContext {
    NSManagedObjectContext *context = [[[NSThread currentThread] threadDictionary] objectForKey:@"NSManagedObjectContext"];
    if (context == nil) {
        context = [[[NSManagedObjectContext alloc] init] autorelease];
        [context setPersistentStoreCoordinator:self.persistentStoreCoordinator];
        [[[NSThread currentThread] threadDictionary] setObject:context forKey:@"NSManagedObjectContext"];
    }
    return context;
}
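
With that accessor in place, the capture-queue side of the question could look roughly like the following sketch, run at the end of the delegate method after the UIImage has been created. The CapturedFrame entity and its fileURL attribute are assumed names for illustration; substitute your own model:

// On the capture queue, after building the UIImage.
NSData *jpegData = UIImageJPEGRepresentation(image, 0.8);
NSString *fileName = [NSString stringWithFormat:@"%@.jpg", [[NSProcessInfo processInfo] globallyUniqueString]];
NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:fileName];

if ([jpegData writeToFile:path atomically:YES]) {
    // This context belongs to the current (capture) thread, courtesy of the accessor above.
    NSManagedObjectContext *context = [self managedObjectContext];
    NSManagedObject *frame = [NSEntityDescription insertNewObjectForEntityForName:@"CapturedFrame"
                                                           inManagedObjectContext:context];
    [frame setValue:path forKey:@"fileURL"];

    NSError *error = nil;
    if (![context save:&error]) {
        NSLog(@"Failed to save captured frame: %@", error);
    }
    // The save posts NSManagedObjectContextDidSaveNotification, which the main
    // thread's context can merge as described above.
}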

Do remember that you cannot pass NSManagedObjects between threads any more than you can pass contexts. Instead, pass the NSManagedObjectID (from the object's objectID property), and then in the destination thread use that thread's context's objectWithID: method to get back an equivalent object.
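
For example, to hand a just-saved frame back to the main thread, something like this sketch would work (updateUIForFrame: is a hypothetical main-thread method, and self.managedObjectContext is assumed to be the main thread's context):

// On the background (capture) thread, after the context has been saved:
NSManagedObjectID *frameID = [frame objectID];
dispatch_async(dispatch_get_main_queue(), ^{
    // Re-fetch an equivalent object in the main thread's context.
    NSManagedObject *mainThreadFrame = [self.managedObjectContext objectWithID:frameID];
    [self updateUIForFrame:mainThreadFrame];
});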
