Displaying a YUV-formatted video stream in iOS
I am writing an iOS application that gets live video streams from analog security cameras. I can get the video stream from our server application and decode it from its proprietary format on the phone. The decoder leaves me with raw YUV (Y'CbCr, technically) data. I'm not really sure of the best way (or even how) to display this data.
I've read that I should manually convert to RGB and display it in something like a UIImageView, but that seems fairly clunky when the stream could be running at upwards of 30 fps.
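For context, that manual route would look roughly like the sketch below: a per-pixel BT.601 YCbCr-to-RGB conversion into an RGBA buffer that gets wrapped in a UIImage. The plane layout (I420, with 2x2-subsampled chroma) and the parameter names are my assumptions, not anything from the actual decoder.

```swift
import UIKit

// Rough sketch of the CPU path: convert one I420 frame (separate Y, Cb, Cr
// planes, assumed layout) to a 32-bit RGBA buffer, then wrap it in a UIImage.
// Doing this per frame at 30 fps is exactly the "clunky" part.
func imageFromI420(y: [UInt8], cb: [UInt8], cr: [UInt8],
                   width: Int, height: Int) -> UIImage? {
    var rgba = [UInt8](repeating: 255, count: width * height * 4)
    for row in 0..<height {
        for col in 0..<width {
            let yVal  = Double(y[row * width + col])
            // Chroma planes are subsampled 2x2 in I420.
            let cIdx  = (row / 2) * (width / 2) + (col / 2)
            let cbVal = Double(cb[cIdx]) - 128.0
            let crVal = Double(cr[cIdx]) - 128.0
            // BT.601 full-range conversion.
            let r = yVal + 1.402 * crVal
            let g = yVal - 0.344 * cbVal - 0.714 * crVal
            let b = yVal + 1.772 * cbVal
            let o = (row * width + col) * 4
            rgba[o]     = UInt8(max(0, min(255, r)))
            rgba[o + 1] = UInt8(max(0, min(255, g)))
            rgba[o + 2] = UInt8(max(0, min(255, b)))
        }
    }
    guard let provider = CGDataProvider(data: Data(rgba) as CFData),
          let cgImage = CGImage(width: width, height: height,
                                bitsPerComponent: 8, bitsPerPixel: 32,
                                bytesPerRow: width * 4,
                                space: CGColorSpaceCreateDeviceRGB(),
                                bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.noneSkipLast.rawValue),
                                provider: provider, decode: nil,
                                shouldInterpolate: false, intent: .defaultIntent)
    else { return nil }
    return UIImage(cgImage: cgImage)
}
```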
I've also read to use OpenGL with the YUV info to create a 2D texture and display that. Unfortunately I have no idea where to even begin with this and I'm not even sure if this is the direction I want to pursue.
So my question to all of you is: what's the best way to display this information on an iOS device? Secondly, if this requires something like OpenGL, could anyone suggest a good tutorial, book, code sample, or other learning resource so I can learn more about it?
Thanks in advance.
The best way really is to let the GPU do the job. I do know it's possible with a shader program, but frankly I don't speak OpenGL. This question might be of help, though.
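For what it's worth, the core of that GPU approach is just a fragment shader that samples the Y and Cb/Cr planes as separate textures and does the color-space conversion per fragment. Here's a rough sketch of what such a shader might look like (uniform and varying names are illustrative, not from any particular sample), wrapped as a Swift string constant:

```swift
// Sketch of a GLES 2.0 fragment shader for the GPU route, assuming the Y, Cb,
// and Cr planes are each uploaded as a GL_LUMINANCE texture. The per-pixel
// BT.601 math then runs on the GPU instead of the CPU.
let yuvToRGBFragmentShader = """
precision mediump float;
varying highp vec2 vTexCoord;
uniform sampler2D yTexture;   // full-resolution Y plane
uniform sampler2D cbTexture;  // half-resolution Cb plane
uniform sampler2D crTexture;  // half-resolution Cr plane

void main() {
    float y  = texture2D(yTexture,  vTexCoord).r;
    float cb = texture2D(cbTexture, vTexCoord).r - 0.5;
    float cr = texture2D(crTexture, vTexCoord).r - 0.5;

    // BT.601 full-range YCbCr -> RGB
    vec3 rgb = vec3(y + 1.402 * cr,
                    y - 0.344 * cb - 0.714 * cr,
                    y + 1.772 * cb);
    gl_FragColor = vec4(rgb, 1.0);
}
"""
```

The surrounding app would re-upload the three planes each frame (e.g. with glTexSubImage2D) and draw a textured full-screen quad; the conversion itself costs essentially nothing on the GPU compared to doing it per pixel on the CPU.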