
How to play video on iOS that's loaded manually from a socket?

Rather tricky one this...

I'm trying to stream a video (H.264) across a network on iOS. However, I'm getting the video data into a buffer through an open socket to the remote server (using CocoaAsyncSocket), so I don't have a URL to the video that I can use to create an AVAsset or an MPMoviePlayer. The video is a live stream, so the data will just keep coming (i.e. no set duration), if that makes any difference.

I'm having to do it this way as the server is an RTSP server. I've written my own RTSP client for sending commands and receiving responses, and now I'm trying to do something useful with the video data that comes over the connection.

Any ideas on how I can play back this video? The only things I can think of currently are somehow saving to a file and loading that (but I don't see how that'll work, as I'll be continually loading new data), or resorting to doing it manually somehow with something like ffmpeg. And no, unfortunately I can't make the server do HTTP Live Streaming instead.

Any help would be greatly appreciated!


I haven't had to dig this deeply into AVFoundation yet, but you might be able to pull it off by creating an AVAsset with an AVAssetWriter. You give the AVAssetWriter an instance of AVAssetWriterInput, which takes CMSampleBuffer data and packages it up for the writer.

Based on the docs for AVAssetWriterInput, it is designed to take data from a "real-time source".
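Just as a rough illustration, here's a minimal Swift sketch of that wiring. It assumes you've already parsed the incoming H.264 stream into CMSampleBuffers (e.g. by building a CMVideoFormatDescription from the stream's SPS/PPS and wrapping each NAL unit), which is the hard part and is omitted here; the class name and method names are mine, not anything from AVFoundation:

```swift
import AVFoundation

// Minimal sketch, not production code. Assumes CMSampleBuffers are assembled
// elsewhere from the raw socket data.
final class StreamWriter {
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput

    init(outputURL: URL, formatDescription: CMVideoFormatDescription) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)

        // nil outputSettings means "pass the already-encoded samples through";
        // the format hint describes the incoming H.264 stream.
        input = AVAssetWriterInput(mediaType: .video,
                                   outputSettings: nil,
                                   sourceFormatHint: formatDescription)
        input.expectsMediaDataInRealTime = true   // live / real-time source
        writer.add(input)

        writer.startWriting()
        writer.startSession(atSourceTime: .zero)
    }

    // Call each time a new CMSampleBuffer has been assembled from socket data.
    func append(_ sampleBuffer: CMSampleBuffer) {
        if input.isReadyForMoreMediaData {
            input.append(sampleBuffer)
        }
    }

    // Call when the stream ends.
    func finish(completion: @escaping () -> Void) {
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}
```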

I wish I could be of more help, but hopefully this will point you in the right direction.
