
How to use AVCaptureSession to stream live preview video, then take a photo, then return to streaming

I have an application that creates its own live preview prior to taking a still photo. The app needs to run some processing on the image data and thus is not able to rely on AVCaptureVideoPreviewLayer. Getting the initial stream to work is going quite well, using Apple's example code. The problem comes when I try to switch to the higher quality image to take the snapshot. In response to a button press I attempt to reconfigure the session for taking a full resolution photo. I've tried many variations, but here is my latest example (which still does not work):

- (void)sessionSetupForPhoto
{
 [session beginConfiguration];
 session.sessionPreset = AVCaptureSessionPresetPhoto;
 AVCaptureStillImageOutput *output = [[[AVCaptureStillImageOutput alloc] init] autorelease];
 for (AVCaptureOutput *output in [session outputs]) {
  [session removeOutput:output];
 }
 if ([session canAddOutput:output]){
  [session addOutput:output];
 } else {
  NSLog(@"Not able to add an AVCaptureStillImageOutput");
 }
 [session commitConfiguration];
}

Just after the commitConfiguration call, I consistently get an AVCaptureSessionRuntimeErrorNotification delivered to my registered observer. It looks like this:

Received an error: NSConcreteNotification 0x19d870 {name = AVCaptureSessionRuntimeErrorNotification; object = ; userInfo = { AVCaptureSessionErrorKey = "Error Domain=AVFoundationErrorDomain Code=-11800 \"The operation couldn\U2019t be completed. (AVFoundationErrorDomain error -11800.)\" UserInfo=0x19d810 {}";

The documentation in Xcode ostensibly provides more information for that error code (-11800): "AVErrorUnknown - Reason for the error is unknown."

Previously I had also tried calling stopRunning and startRunning around the reconfiguration, but I no longer do that after watching WWDC Session 409, where it is discouraged. When I was stopping and starting, I got a different error, -11819, which corresponds to "AVErrorMediaServicesWereReset - The operation could not be completed because media services became unavailable." That is much nicer than simply "unknown", but not necessarily any more helpful.

It successfully adds the AVCaptureStillImageOutput (i.e., does NOT emit the log message).

I am testing on an iPhone 3G (running iOS 4.1) and an iPhone 4.

This call is happening in the main thread, which is also where my original AVCaptureSession setup took place.
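For reference, the streaming setup follows the standard pattern from Apple's sample code; here is a rough sketch (names are illustrative, and the per-frame processing itself is omitted):

- (void)sessionSetupForStreaming
{
    session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetMedium;

    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if ([session canAddInput:input]) {
        [session addInput:input];
    }

    // The video data output feeds frames to captureOutput:didOutputSampleBuffer:fromConnection:
    AVCaptureVideoDataOutput *videoOutput = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
    videoOutput.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                             forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    dispatch_queue_t queue = dispatch_queue_create("videoQueue", NULL);
    [videoOutput setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);
    if ([session canAddOutput:videoOutput]) {
        [session addOutput:videoOutput];
    }

    [session startRunning];
}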

How can I avoid the error? How can I switch to the higher resolution to take the photo?

Thank you!


Since you're processing the video data coming out of the AVCaptureSession, I'm assuming you have an AVCaptureVideoDataOutput connected to it prior to calling sessionSetupForPhoto.

If so, can you elaborate on what you're doing in captureOutput:didOutputSampleBuffer:? Without being able to see more, I'm guessing there may be a problem with removing the old outputs and subsequently setting the photo quality preset.
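For comparison, a minimal implementation of that delegate method looks roughly like this (purely illustrative; the interesting part is whatever processing you are doing with the pixel buffer):

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    // Read the frame data out here and hand it to your processing code.
    // This method should return quickly; holding on to sample buffers can stall the capture pipeline.
    void *baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    size_t width = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    // ... process baseAddress / width / height / bytesPerRow ...

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
}

If that method is retaining buffers or doing heavy work inline, it's worth ruling that out first.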

Also, the output variable you're using as the loop variable when you remove your outputs shadows the still image output you just created. Not a functional problem, but it makes the code harder to read.


There is no need to reconfigure the session when you want to grab a still. Just add an AVCaptureStillImageOutput to your session at initialization, call the following when you are about to capture the image, and use the CMSampleBufferRef from the completion handler accordingly:

[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                               completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error)
{
    // Use imageDataSampleBuffer here.
}];
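Here is a rough sketch of how the pieces fit together (illustrative names such as stillImageOutput, and the JPEG output settings, are my assumptions rather than anything from your code):

// At session setup time, alongside whatever video data output you already use.
// stillImageOutput is assumed to be a retained ivar.
stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
stillImageOutput.outputSettings = [NSDictionary dictionaryWithObject:AVVideoCodecJPEG
                                                               forKey:AVVideoCodecKey];
if ([session canAddOutput:stillImageOutput]) {
    [session addOutput:stillImageOutput];
}

// When the shutter button is pressed, find the video connection on the still image output...
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in stillImageOutput.connections) {
    for (AVCaptureInputPort *port in connection.inputPorts) {
        if ([port.mediaType isEqual:AVMediaTypeVideo]) {
            videoConnection = connection;
            break;
        }
    }
    if (videoConnection) break;
}

// ...and capture asynchronously, turning the sample buffer into a UIImage.
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                               completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error)
{
    if (imageDataSampleBuffer) {
        NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        UIImage *image = [[[UIImage alloc] initWithData:jpegData] autorelease];
        // Hand the full-resolution image to your processing code.
    }
}];

Because the still image output is attached from the start, nothing has to be torn down or reconfigured at capture time, and the live video data output keeps streaming before and after the shot.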
