Layer OpenGL Content Over AVCaptureSession CVImageBufferRef from Camera

I have two working pieces that I now want to merge. First, I can layer a CATextLayer onto a CVImageBufferRef camera frame and save the result via an AVAssetWriter using an AVAssetWriterInputPixelBufferAdaptor. I do that like this:

- (void) processNewBuffer:(CVImageBufferRef)cameraFrame {
    NSAutoreleasePool * pool = [[NSAutoreleasePool alloc] init];

    // Update the CALayer on the main queue;
    // UIKit/Core Animation is not thread-safe.
    dispatch_sync(dispatch_get_main_queue(), ^{ 
        [self updateCALayer]; 
    });

    if(recorder.recording) {

        CVPixelBufferLockBaseAddress(cameraFrame,0); 

        // Get the frame's geometry and a pointer to its pixel data
        uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(cameraFrame); 
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(cameraFrame); 
        width = CVPixelBufferGetWidth(cameraFrame); 
        height = CVPixelBufferGetHeight(cameraFrame); 

        /* Wrap the camera frame's pixel memory in a CGBitmapContext */
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 
        CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst); 

        [textLayer renderInContext:newContext];

        [recorder appendPixelBuffer:cameraFrame withPresentationTime:camera.lastSampleTime];

        CVPixelBufferUnlockBaseAddress(cameraFrame,0);

        /* Release the context and color space */
        CGContextRelease(newContext); 
        CGColorSpaceRelease(colorSpace);
    }

    [pool drain]; 
}

This works like a charm. Now for my second trick. Thanks to the answer to this question:

CMSampleBuffer from OpenGL for video output with AVAssestWritter

I can modify the OpenGL Teapot example from the WWDC 2010 sample code and save the rendered content to a movie file on the iPhone.
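
In outline, the readback technique looks roughly like this (a simplified sketch, not the linked answer's exact code: adaptor and frameTime are placeholder names, the adaptor's pool is assumed to be configured for kCVPixelFormatType_32BGRA, and the buffer's bytes-per-row is assumed to equal width * 4):

CVPixelBufferRef glBuffer = NULL;
CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                   adaptor.pixelBufferPool, &glBuffer);

CVPixelBufferLockBaseAddress(glBuffer, 0);

// Read the framebuffer straight into the pixel buffer's memory.
// GL_BGRA_EXT matches kCVPixelFormatType_32BGRA, so no per-pixel channel
// swap is needed. Note that glReadPixels returns rows bottom-up, so the
// frame comes out vertically flipped unless you compensate.
glReadPixels(0, 0, 320, 320, GL_BGRA_EXT, GL_UNSIGNED_BYTE,
             CVPixelBufferGetBaseAddress(glBuffer));

[adaptor appendPixelBuffer:glBuffer withPresentationTime:frameTime];

CVPixelBufferUnlockBaseAddress(glBuffer, 0);
CVPixelBufferRelease(glBuffer);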

NOW, what I want is the ability to layer the teapot in one corner of the camera frame and save the combined result to a movie. The problem I am having is basic C stuff: how do I copy from one buffer into the other when one buffer is 1280x720 (the camera frame) and the teapot lives in a 320x320 buffer? The other consideration is speed. To keep up with 30 fps, I can't be moving in and out of CGImageRef or UIImage; this has to happen as fast as possible. What is the best way to accomplish this?
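
The obvious approach I can think of is one memcpy per row, offsetting into the larger buffer. A rough sketch (the helper name is made up; both buffers are assumed to be 32BGRA and already locked, and bytes-per-row must come from CVPixelBufferGetBytesPerRow because rows can be padded):

// Copy a small BGRA pixel buffer into a corner of a large BGRA pixel buffer.
// Both buffers must already be locked with CVPixelBufferLockBaseAddress.
static void CopyBufferIntoCorner(CVPixelBufferRef small, CVPixelBufferRef large,
                                 size_t xOffset, size_t yOffset)
{
    uint8_t *src = (uint8_t *)CVPixelBufferGetBaseAddress(small);
    uint8_t *dst = (uint8_t *)CVPixelBufferGetBaseAddress(large);
    size_t srcBytesPerRow = CVPixelBufferGetBytesPerRow(small);
    size_t dstBytesPerRow = CVPixelBufferGetBytesPerRow(large);
    size_t srcWidth  = CVPixelBufferGetWidth(small);   // 320
    size_t srcHeight = CVPixelBufferGetHeight(small);  // 320

    // Advance the destination pointer to the first pixel of the corner.
    dst += yOffset * dstBytesPerRow + xOffset * 4;     // 4 bytes per BGRA pixel

    // One memcpy per row; the rows are not contiguous because the
    // destination is wider than the source and rows may be padded.
    for (size_t row = 0; row < srcHeight; row++) {
        memcpy(dst, src, srcWidth * 4);
        src += srcBytesPerRow;
        dst += dstBytesPerRow;
    }
}

Note that memcpy overwrites the destination outright, so the teapot would land as an opaque 320x320 patch; blending it against the camera pixels would need a per-pixel pass instead. Is a row-by-row copy like this the right way, or is there something faster?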


You could try the SpriteKit framework (introduced in iOS 7), which is built on top of OpenGL and should sustain high frame rates while working with images and textures.

Start with Apple's introduction: https://developer.apple.com/library/ios/documentation/GraphicsAnimation/Conceptual/SpriteKit_PG/Introduction/Introduction.html#//apple_ref/doc/uid/TP40013043-CH1-SW1
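
For example, keeping a textured node pinned in one corner of a scene takes only a few lines (a sketch only; the image name is made up):

#import <SpriteKit/SpriteKit.h>

@interface OverlayScene : SKScene
@end

@implementation OverlayScene

- (void)didMoveToView:(SKView *)view
{
    // Place a textured node in the bottom-left corner of the scene.
    SKSpriteNode *overlay = [SKSpriteNode spriteNodeWithImageNamed:@"teapot"];
    overlay.position = CGPointMake(overlay.size.width / 2,
                                   overlay.size.height / 2);
    [self addChild:overlay];
}

@end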

Hope it helps
