
Camera differences between UIImagePickerController and AVCaptureSession on iPhone

I'm trying to build a replacement for UIImagePickerController, using AVCaptureSession with AVCaptureDeviceInput and AVCaptureStillImageOutput as input and output, respectively.

To preview the camera stream I'm using AVCaptureVideoPreviewLayer.

It now works correctly for capturing and storing photos, just like the default Camera app.
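For reference, a minimal Swift sketch of the setup described above might look like the following (the CameraController class and method names are hypothetical, and AVCaptureStillImageOutput is used here because it is what the question describes; on iOS 10+ you would use AVCapturePhotoOutput instead):

```swift
import AVFoundation
import UIKit

final class CameraController {
    let session = AVCaptureSession()
    // Still-image output, as described in the question (deprecated on modern iOS).
    let stillOutput = AVCaptureStillImageOutput()
    var previewLayer: AVCaptureVideoPreviewLayer?

    func configure(in view: UIView) throws {
        // Input: the default video device wrapped in an AVCaptureDeviceInput.
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }

        // Output: still images encoded as JPEG.
        stillOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
        if session.canAddOutput(stillOutput) { session.addOutput(stillOutput) }

        // Preview: an AVCaptureVideoPreviewLayer showing the live camera stream.
        let preview = AVCaptureVideoPreviewLayer(session: session)
        preview.frame = view.bounds
        preview.videoGravity = .resizeAspectFill
        view.layer.addSublayer(preview)
        previewLayer = preview

        session.startRunning()
    }
}
```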

However, I found three problems I was unable to solve:

  • captured photos don't match the quality the default camera produces
  • the field of view is narrower, similar to video capture in the default Camera app
  • there is no way to control camera-specific options such as the flash

Is there any way to reach the level of UIImagePickerController with a more customizable approach (e.g. AVFoundation or anything else)?


Check out "Session 409 - Using the Camera with AV Foundation" in the WWDC 2010 videos. Based on the video, it looks like you can resolve all three of your issues with AVFoundation.
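As a rough illustration of what that session covers, the three issues map onto two AVFoundation settings: the session preset (which controls both still-image quality and field of view) and the capture device's flash configuration. A hedged sketch, assuming `session` and `camera` come from a setup like the one in the question:

```swift
import AVFoundation

func applyPhotoQualityAndFlash(session: AVCaptureSession, camera: AVCaptureDevice) {
    // Issues 1 and 2: the .photo preset yields full-resolution stills and the
    // full photo field of view, rather than the narrower video-sized frames.
    if session.canSetSessionPreset(.photo) {
        session.sessionPreset = .photo
    }

    // Issue 3: flash is configured on the AVCaptureDevice itself and requires
    // a configuration lock.
    if camera.hasFlash {
        do {
            try camera.lockForConfiguration()
            camera.flashMode = .auto   // .on / .off / .auto
            camera.unlockForConfiguration()
        } catch {
            print("Could not lock camera for configuration: \(error)")
        }
    }
}
```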

Hope this helps!

