
iOS: Multiple AVPlayer objects result in loss of audio/video sync

I've been trying to use two different AVQueuePlayer objects within my app. The first player plays a series of short video clips streamed over the net. The second player plays only one video, but it is much longer; it is also streamed. I have verified that all the clips have proper audio/video sync on their own.

What I've noticed is that if I create one AVPlayer object after having created another one beforehand, the audio/video sync in the second player is lost: the audio plays roughly 800–1500 ms too early.

I've tried a number of things, including adding a delay between tearing down the first player and allocating the second, and removing all calls to the AVAudioSession code. None of this seems to help. Very occasionally the audio will be in sync, but only about 1 time in 30.
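One workaround worth trying, sketched below under the assumption that the bug is triggered by allocating a second player: keep a single AVQueuePlayer alive for the whole session and swap its items when moving from the short clips to the long video, instead of destroying one player and creating another. The class and method names here are hypothetical, as are the URLs.

```swift
import AVFoundation

// Hypothetical sketch: one long-lived AVQueuePlayer shared by both
// playback phases, so a second player is never allocated.
final class PlaybackController {
    let player = AVQueuePlayer()

    // Enqueue the series of short clips.
    func playClips(urls: [URL]) {
        player.removeAllItems()
        for url in urls {
            player.insert(AVPlayerItem(url: url), after: nil)
        }
        player.play()
    }

    // Replace the clip queue with the single long-form item,
    // rather than tearing the player down and rebuilding it.
    func switchToLongVideo(url: URL) {
        player.pause()
        player.removeAllItems()
        player.insert(AVPlayerItem(url: url), after: nil)
        player.play()
    }
}
```

This avoids the player re-creation path entirely; whether it sidesteps the sync drift described above is untested.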

Has anyone else seen the same behavior? Does anyone know how to fix this?

Thanks to anyone that can help!


I'm facing the same problem myself. I came across some information in the AVPlayerLayer documentation:

During playback, AVPlayer may compensate for temporal drift between its visual output and its audible output to one or more independently-clocked audio output devices by adjusting the timing of its associated player layers. The effects of these adjustments are usually very small; however, clients that wish to remain entirely unaffected by such adjustments may wish to place other layers for which timing is important into independently timed subtrees of their layer trees.

You can create arbitrary numbers of player layers with the same AVPlayer object. Only the most-recently-created player layer will actually display the video content on-screen.

Unfortunately I haven't translated this into actual code yet, but I figured it might help point you in the right direction. If you do come up with a solution then please post it here, and I will do the same.
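To make the layer-sharing point above concrete, here is a minimal sketch of attaching two AVPlayerLayer instances to one shared player. The helper function and view parameter are hypothetical; per the documentation quoted above, only the most recently created layer actually renders the video.

```swift
import AVFoundation
import UIKit

// One player shared by every layer.
let sharedPlayer = AVQueuePlayer()

// Hypothetical helper: create a player layer for the shared player
// and attach it to the given view. Creating a new layer this way
// moves on-screen rendering to it.
func attachPlayerLayer(to view: UIView) -> AVPlayerLayer {
    let playerLayer = AVPlayerLayer(player: sharedPlayer)
    playerLayer.frame = view.bounds
    playerLayer.videoGravity = .resizeAspect
    view.layer.addSublayer(playerLayer)
    return playerLayer
}
```

The idea is to reuse one player (and so one audio/visual clock) across both parts of the UI, rather than instantiating a second player.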
