
iPhone: processing frames that are being recorded by the camera

I have this app that records video, and I need to fire a method every time a frame is grabbed. After banging my head against the wall, I decided to try the following: create a dispatch queue, as I would to grab video from an output, just so that a method gets called whenever the camera records a frame.

I am trying to understand a section of code created by Apple for recording video, to figure out how I should add the dispatch queue. Below is the Apple code; the section marked between asterisks is what I added in order to create the queue. It compiles without errors, but captureOutput:didOutputSampleBuffer:fromConnection: is never called.

- (BOOL) setupSessionWithPreset:(NSString *)sessionPreset error:(NSError **)error
{
    BOOL success = NO;

    // Init the device inputs
    AVCaptureDeviceInput *videoInput = [[[AVCaptureDeviceInput alloc] initWithDevice:[self backFacingCamera] error:error] autorelease];
    [self setVideoInput:videoInput]; // stash this for later use if we need to switch cameras

    AVCaptureDeviceInput *audioInput = [[[AVCaptureDeviceInput alloc] initWithDevice:[self audioDevice] error:error] autorelease];
    [self setAudioInput:audioInput];

    AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
    [self setMovieFileOutput:movieFileOutput];
    [movieFileOutput release];


    // Setup and start the capture session
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    if ([session canAddInput:videoInput]) {
        [session addInput:videoInput];
    }
    if ([session canAddInput:audioInput]) {
        [session addInput:audioInput];
    }
    if ([session canAddOutput:movieFileOutput]) {
        [session addOutput:movieFileOutput];
    }

    [session setSessionPreset:sessionPreset];


    //  I added this *****************
    dispatch_queue_t queue = dispatch_queue_create("myqueue", NULL);
    [[self videoDataOutput] setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);
    // ******************** end of my code      

    [session startRunning];
    [self setSession:session];
    [session release];
    success = YES;
    return success;
}

What I need is just a method where I can process every frame that is being recorded.

thanks


Having set yourself as the delegate, you'll receive a call to:

- (void)captureOutput:(AVCaptureOutput *)captureOutput 
         didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
         fromConnection:(AVCaptureConnection *)connection

every time a new frame is captured. You can put whatever code you want in there; just be careful, because you won't be on the main thread. It's probably safest to do a quick [target performSelectorOnMainThread:@selector(methodYouActuallyWant)] from within -captureOutput:didOutputSampleBuffer:fromConnection:.
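As a sketch of what that looks like (frameWasCaptured is a hypothetical method of your own, not part of AVFoundation):

```objc
// Called on the dispatch queue you passed to
// -setSampleBufferDelegate:queue:, NOT on the main thread.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Inspect the frame here if you need its pixel data...
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    (void)imageBuffer; // ...e.g. lock its base address and read the BGRA bytes

    // ...then bounce any UIKit work back to the main thread.
    [self performSelectorOnMainThread:@selector(frameWasCaptured)
                           withObject:nil
                        waitUntilDone:NO];
}
```

Note that sampleBuffer is only guaranteed valid for the duration of this call; if you need to keep it, CFRetain it (and CFRelease it later).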

Addition: here is the setup I use in my code, which successfully leads to the delegate method being called. I'm unable to see any substantial difference between it and what you're using.

- (id)initWithSessionPreset:(NSString *)sessionPreset delegate:(id <AAVideoSourceDelegate>)aDelegate
{

#ifndef TARGET_OS_EMBEDDED
    return nil;
#else

    if(self = [super init])
    {
        delegate = aDelegate;

        NSError *error = nil;

        // create a low-quality capture session
        session = [[AVCaptureSession alloc] init];
        session.sessionPreset = sessionPreset;

        // grab a suitable device...
        device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

        // ...and a device input
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];

        if(!input || error)
        {
            [self release];
            return nil;
        }
        [session addInput:input];

        // create an AVCaptureVideoDataOutput to route output to us
        AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
        [session addOutput:[output autorelease]];

        // create a suitable dispatch queue, GCD style, and hook self up as the delegate
        dispatch_queue_t queue = dispatch_queue_create("aQueue", NULL);
        [output setSampleBufferDelegate:self queue:queue];
        dispatch_release(queue);

        // set 32bpp BGRA pixel format, since I'll want to make sense of the frame
        output.videoSettings =
            [NSDictionary 
                dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                forKey:(id)kCVPixelBufferPixelFormatTypeKey];

    }

    return self;
#endif
}

- (void)start
{
    [session startRunning];
}

- (void)stop
{
    [session stopRunning];
}
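For completeness, a hypothetical caller might drive the class above like this (AAVideoSource and AAVideoSourceDelegate are names from my own code, not AVFoundation):

```objc
// Hypothetical usage of the AAVideoSource class sketched above,
// under the same manual-retain-release conventions as the original.
AAVideoSource *source =
    [[AAVideoSource alloc] initWithSessionPreset:AVCaptureSessionPresetMedium
                                        delegate:self];
[source start];
// ...frames now arrive via the sample-buffer delegate on its queue...
[source stop];
[source release];
```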


// create a suitable dispatch queue, GCD style, and hook self up as the delegate
dispatch_queue_t queue = dispatch_queue_create("aQueue", NULL);
[output setSampleBufferDelegate:self queue:queue];
dispatch_release(queue);

Also, very importantly, inside

- (void)captureOutput:(AVCaptureOutput *)captureOutput 
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
       fromConnection:(AVCaptureConnection *)connection 

be sure to put an

NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

at the beginning and a [pool drain] at the end; otherwise it will crash after too many frames have been processed.
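Put together, the delegate method would be wrapped like this (the processing in the middle is a placeholder; this applies to manual-retain-release code, before ARC):

```objc
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // The callback runs on a GCD queue with no autorelease pool of its
    // own, so create one per frame to drain autoreleased objects.
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    // ...process the frame here...

    [pool drain];
}
```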
