
Capturing video while processing it through a shader on iPhone

I am trying to develop an iPhone app that processes/filters and records video.

I have two sample apps that have aspects of what I need and am trying to combine them.

  1. AVCamDemo from the WWDC10 sample code package (Apple Developer ID required)

    This deals with capturing/recording video.

  2. Brad Larson's ColorTracking sample app referenced here

    This deals with live processing of video using OpenGL ES.

I get stuck when trying to combine the two.

What I have been trying to do is use AVCaptureVideoDataOutput and the AVCaptureVideoDataOutputSampleBufferDelegate protocol to process/filter the video frames through OpenGL ES (as in 2 above), and at the same time somehow use AVCaptureMovieFileOutput to record the processed video (as in 1 above).
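To make the attempted setup concrete, here is a minimal Swift sketch of it (written against the current AVFoundation API; the class and queue names are my own, and whether the movie output can coexist with the data output is exactly what I am unsure about):

```swift
import AVFoundation

// Sketch of the capture side: one session feeding both a data output
// (for the OpenGL ES shader pass) and a movie file output (for recording).
final class CaptureController: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    let movieOutput = AVCaptureMovieFileOutput()
    private let sampleQueue = DispatchQueue(label: "camera.sample.queue")

    func configure() throws {
        session.beginConfiguration()

        guard let camera = AVCaptureDevice.default(for: .video) else {
            session.commitConfiguration()
            return
        }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }

        // Raw BGRA frames are delivered to the delegate for filtering.
        let dataOutput = AVCaptureVideoDataOutput()
        dataOutput.videoSettings =
            [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
        dataOutput.setSampleBufferDelegate(self, queue: sampleQueue)
        if session.canAddOutput(dataOutput) { session.addOutput(dataOutput) }

        // The part I am unsure about: adding the movie output to the same session.
        if session.canAddOutput(movieOutput) { session.addOutput(movieOutput) }

        session.commitConfiguration()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Upload pixelBuffer as a texture and run the fragment shader here,
        // as in the ColorTracking sample.
        _ = pixelBuffer
    }
}
```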

Is this approach possible? If so, how would I need to set up the connections within the AVCaptureSession?

Or do I need to use the AVCaptureVideoDataOutputSampleBufferDelegate protocol to process/filter the video AND then recombine the individual frames back into a movie, without using AVCaptureMovieFileOutput to save the movie file?
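If this second route is the one that works, my understanding is that AVAssetWriter with a pixel buffer adaptor is the piece that recombines frames into a movie file; a rough Swift sketch under that assumption (all names below are mine):

```swift
import AVFoundation

// Hypothetical recorder that appends already-filtered pixel buffers to a movie file.
final class ProcessedFrameRecorder {
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput
    private let adaptor: AVAssetWriterInputPixelBufferAdaptor
    private var didStart = false

    init(outputURL: URL, width: Int, height: Int) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
        input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: width,
            AVVideoHeightKey: height,
        ])
        input.expectsMediaDataInRealTime = true
        adaptor = AVAssetWriterInputPixelBufferAdaptor(
            assetWriterInput: input,
            sourcePixelBufferAttributes: [
                kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
            ])
        writer.add(input)
    }

    // Called once per processed frame, after the shader pass has rendered
    // its result back into `pixelBuffer`.
    func append(_ pixelBuffer: CVPixelBuffer, at time: CMTime) {
        if !didStart {
            writer.startWriting()
            writer.startSession(atSourceTime: time)
            didStart = true
        }
        if input.isReadyForMoreMediaData {
            adaptor.append(pixelBuffer, withPresentationTime: time)
        }
    }

    func finish(completion: @escaping () -> Void) {
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}
```

The idea would be to call `append(_:at:)` from the sample buffer delegate with each filtered frame's presentation timestamp, and `finish(completion:)` when recording stops, but I am not sure this is the intended way to do it.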

Any suggestions for the best approach to accomplish this are much appreciated!
