
AVCaptureSession returning blank image on iPhone 4

I am trying to run the "Putting It All Together" example from the Media Capture chapter of the AVFoundation Programming Guide, but I keep getting a blank (black) image. Is there anything I need to call first so this code can access the camera? Thanks.

This is unmodified code from the example:

- (void)setupCapture {
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetLow;

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately.
        NSLog(@"no input");
        return;
    }
    [session addInput:input];

    AVCaptureVideoDataOutput *output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
    [session addOutput:output];
    output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                       forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    output.minFrameDuration = CMTimeMake(1, 15);

    dispatch_queue_t queue = dispatch_queue_create("MyQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    [session startRunning];
}

UIImage *imageFromSampleBuffer(CMSampleBufferRef sampleBuffer) {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer.
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the number of bytes per row for the pixel buffer.
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the pixel buffer width and height.
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space.
    static CGColorSpaceRef colorSpace = NULL;
    if (colorSpace == NULL) {
        colorSpace = CGColorSpaceCreateDeviceRGB();
        if (colorSpace == NULL) {
            // Handle the error appropriately.
            return nil;
        }
    }

    // Get the base address of the pixel buffer.
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    // Get the data size for contiguous planes of the pixel buffer.
    size_t bufferSize = CVPixelBufferGetDataSize(imageBuffer);

    // Create a Quartz direct-access data provider that uses data we supply.
    CGDataProviderRef dataProvider =
        CGDataProviderCreateWithData(NULL, baseAddress, bufferSize, NULL);
    // Create a bitmap image from data supplied by the data provider.
    CGImageRef cgImage =
        CGImageCreate(width, height, 8, 32, bytesPerRow,
                      colorSpace, kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Little,
                      dataProvider, NULL, true, kCGRenderingIntentDefault);
    CGDataProviderRelease(dataProvider);

    // Create and return an image object to represent the Quartz image.
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    return image;
}

And my callback method is:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    UIImage *image = imageFromSampleBuffer(sampleBuffer);
    NSLog(@"am I nil?: %@", image);
    self.imageV.image = image;
    [self.view setNeedsDisplay];
}


Tip 1: do not start the capture session in viewDidLoad; kick it off a little later (viewDidAppear: is fine).

Tip 2: do not update your UI in the capture session callback method; do it on the main thread.
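A minimal sketch of both tips applied to the code above, assuming the same setupCapture method and imageV outlet from the question. The sample-buffer delegate fires on whatever queue you passed to setSampleBufferDelegate:queue:, so the UIKit work has to be dispatched back onto the main queue:

// Tip 1: start capture after the view is on screen, not in viewDidLoad.
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [self setupCapture];
}

// Tip 2: this delegate runs on the capture queue; UIKit may only be
// touched from the main thread.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    // Convert the frame while still on the capture queue.
    UIImage *image = imageFromSampleBuffer(sampleBuffer);
    // Hop to the main queue before assigning to the image view.
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageV.image = image;
    });
}

dispatch_async keeps the capture queue from stalling while the main thread draws, and the copied block retains image until the assignment runs.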

