Convert a CMSampleBuffer to a UIImage object
I tried the example from Apple's documentation for capturing video frames from the camera as images using AV Foundation:
http://developer.apple.com/library/ios/#qa/qa1702/_index.html
But the delegate method that converts the CMSampleBufferRef into a UIImage object does not build. I have imported the AVFoundation framework, yet I still get 14 linker errors such as:
_CVPixelBufferUnlockBaseAddress, referenced from: .
If anyone knows how to solve this, please help me. Thanks in advance.
This is the code to capture and get the image from captured data:
-(IBAction)startCapture
{
    // Session object
    captureSession = [[AVCaptureSession alloc] init];
    captureSession.sessionPreset = AVCaptureSessionPresetMedium;

    // Preview layer so the camera feed is visible on screen
    AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
    previewLayer.frame = CGRectMake(0, 10, 320, 200); // or self.view.frame
    [self.view.layer addSublayer:previewLayer];

    NSError *error = nil;
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    // Input object
    AVCaptureDeviceInput *inputDevice = [[AVCaptureDeviceInput alloc] initWithDevice:device error:&error];
    [captureSession addInput:inputDevice];

    // Still image output configured to deliver JPEG data
    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    [captureSession addOutput:stillImageOutput];

    [captureSession startRunning];
}
-(IBAction)captureNow
{
    // Find the video connection on the still image output
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    NSLog(@"about to request a capture from: %@", stillImageOutput);
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        NSLog(@"exif attachments: %@", exifAttachments);
        if (exifAttachments)
        {
            // Convert the JPEG sample buffer into NSData, then into a UIImage
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
            UIImage *image = [[UIImage alloc] initWithData:imageData];
            self.vImage.image = image;
            // Do something with the attachments.
        }
        else
        {
            NSLog(@"no attachments");
        }
    }];
}
What is your end goal? Are you trying to record video to a video file, or just capture specific frames? I have a working example of capturing a video file with audio using AVCaptureSession; if it would help I can post some code snippets, but seeing as there are a few bits and bobs involved I would like to know specifically what you are trying to do.
Cheers,
Michael
To save to the library once the assetWriter has finished recording the movie/capturing the image, put this at the top of the implementation file (this is an example of saving a video file, but you could change it to save your image):
ALAssetsLibraryWriteVideoCompletionBlock _videoCompblock = ^(NSURL *assetURL, NSError *error){
    if (assetURL) {
        NSLog(@"Saved to camera roll with video asset URL: %@", [assetURL absoluteString]);
        NSError *error;
        NSFileManager *fileManager = [NSFileManager defaultManager];
        NSDictionary *attributes = [fileManager attributesOfItemAtPath:[assetURL absoluteString] error:&error];
        if (attributes) {
            long fileSize = [[attributes objectForKey:NSFileSize] longValue]; // NSFileSize is really an unsigned long long
            NSLog(@"%ld", fileSize);
        }
    } else if (error) {
        NSLog(@"The error occurred: %@", [error localizedDescription]);
    }
};
Then you need a function that uses the above block, so when your capture session finishes recording, have something like this:
-(void) stopRecording{
    writing = NO;
    isRecording = NO;
    [audioInput markAsFinished]; // if you have an audio writer, stop it too
    [videoInput markAsFinished];
    [assetWriter endSessionAtSourceTime:[frTimer currentTimeStamp]];
    [assetWriter finishWriting];
    finished = YES;
    [videoUtilities saveToCamera:[assetWriter outputURL]];
    NSLog(@"%@", [[assetWriter outputURL] absoluteString]);
}
which triggers the save-to-camera function, which will look something like this:
+(void) saveToCamera:(NSURL *)urlPath{
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    AVAsset *movFile = [AVURLAsset URLAssetWithURL:urlPath options:nil];
    NSLog(@"Movie file %@", movFile);
    BOOL isSupported = [library videoAtPathIsCompatibleWithSavedPhotosAlbum:urlPath];
    if (isSupported) {
        NSLog(@"IS SUPPORTED - SAVING TO CAMERA ROLL");
        [library writeVideoAtPathToSavedPhotosAlbum:urlPath completionBlock:_videoCompblock];
    }
}
If you are trying to display the saved image in real time as you take it, you need to copy the UIImage data into a UIImage object when you take the photo and assign that as the image for the UIImageView. Or you can enumerate through the asset library and pull it from there, as in the sketch below.
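For the second option, a minimal sketch of pulling the most recently saved photo back out of the asset library might look like the following. This is not part of my own code above: the showLatestSavedPhoto method name and the imageView outlet are hypothetical, and it assumes the ALAssetsLibrary API available at the time.
#import <AssetsLibrary/AssetsLibrary.h>

// Rough sketch: fetch the newest photo from the Saved Photos album and show it
// in a hypothetical `imageView` outlet.
- (void)showLatestSavedPhoto
{
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos
                           usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
        // Enumerate in reverse so the most recent asset comes first
        [group enumerateAssetsWithOptions:NSEnumerationReverse
                               usingBlock:^(ALAsset *result, NSUInteger index, BOOL *innerStop) {
            if (result) {
                CGImageRef cgImage = [[result defaultRepresentation] fullScreenImage];
                self.imageView.image = [UIImage imageWithCGImage:cgImage];
                *innerStop = YES;
                *stop = YES;
            }
        }];
    } failureBlock:^(NSError *error) {
        NSLog(@"Asset library enumeration failed: %@", [error localizedDescription]);
    }];
}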
Hope this helps,
Cheers,
Michael
You have to link CoreMedia.framework and CoreVideo.framework into your target, not just import AVFoundation. The undefined symbols such as _CVPixelBufferUnlockBaseAddress live in those two frameworks, which is why the linker complains.
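For reference, the conversion method from QA1702 that uses those symbols looks roughly like this; it assumes the video data output has been configured to deliver kCVPixelFormatType_32BGRA frames, as the sample in the linked document does.
// Sketch based on Apple's QA1702 sample: convert a BGRA CMSampleBuffer into a UIImage.
// Requires CoreMedia.framework and CoreVideo.framework to be linked.
#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>

- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    // Get the pixel buffer and lock it so its base address can be read
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Wrap the raw BGRA bytes in a bitmap context and snapshot it as a CGImage
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
        colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    UIImage *image = [UIImage imageWithCGImage:quartzImage];
    CGImageRelease(quartzImage);
    return image;
}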