
Has anyone been able to play a video file and show live camera feed at the same time in separate views on iOS?

I have been trying to do this for a few days now using AVFoundation as well as trying to use MPMoviePlayerViewController. The closest I can get is allowing one to play at a time. I would like to think that this is possible because of FaceTime. However, I know this is a little different because there is no separate video file.

Any ideas would help, and thanks.


I'm not sure where this is documented, but to get AVCaptureVideoPreviewLayer and MPMoviePlayerViewController to play together at the same time you need to set a mixable audio session category first.

Here's one way to do that:

#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>

// Playback category, overridden to mix with other audio so capture and playback can coexist.
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayback error:nil];
UInt32 mixable = 1;
AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryMixWithOthers, sizeof(mixable), &mixable);
[session setActive:YES error:nil];

See the Audio Session Programming Guide and Audio Session Cookbook for more info.
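For context, here is a minimal sketch of how the two views might then be laid out side by side once the mixable session is active. This is an assumption-heavy illustration, not part of the original answer: it assumes it runs inside a view controller, and the file name "sample.mp4", the half-screen frames, and the variable names are all made up for the example.

#import <AVFoundation/AVFoundation.h>
#import <MediaPlayer/MediaPlayer.h>

// Inside a view controller, after the mixable session above has been activated.

// Live camera preview in the top half of the screen.
AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
if (input) {
    [captureSession addInput:input];
}

AVCaptureVideoPreviewLayer *previewLayer =
    [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
previewLayer.frame = CGRectMake(0, 0,
                                self.view.bounds.size.width,
                                self.view.bounds.size.height / 2);
[self.view.layer addSublayer:previewLayer];
[captureSession startRunning];

// Embedded movie playback in the bottom half (not fullscreen, so both stay visible).
NSURL *movieURL = [[NSBundle mainBundle] URLForResource:@"sample" withExtension:@"mp4"];
MPMoviePlayerController *moviePlayer =
    [[MPMoviePlayerController alloc] initWithContentURL:movieURL];
moviePlayer.view.frame = CGRectMake(0, self.view.bounds.size.height / 2,
                                    self.view.bounds.size.width,
                                    self.view.bounds.size.height / 2);
[self.view addSubview:moviePlayer.view];
[moviePlayer play];
// Keep a strong reference to moviePlayer (e.g. in a property) so it isn't deallocated.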


Have you tried playing the video on one thread and recording video on another? That would allow both of them to run while keeping them separate.
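If you try that, one caveat: UIKit and layer updates must stay on the main thread, but -[AVCaptureSession startRunning] blocks and can safely be called from a background queue. A minimal sketch of that split, assuming a captureSession and moviePlayer already configured as in the snippet above:

// Start the (already configured) capture session off the main thread,
// since -startRunning blocks until capture is up and running.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    [captureSession startRunning];

    // Playback and any other UI work go back to the main thread.
    dispatch_async(dispatch_get_main_queue(), ^{
        [moviePlayer play];
    });
});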
