I am going insane with this one - I have looked everywhere and tried anything and everything I can think of.
I have successfully composed an AVMutableComposition with multiple video clips and can view it and export it, and I would like to be able to transition between them using a cross-fade.
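One common approach to a cross-fade between overlapping tracks of an AVMutableComposition is an opacity ramp in an AVMutableVideoCompositionInstruction. The sketch below assumes you have already laid out two video tracks so that they overlap for the transition range; the track and timing setup is not shown.

```swift
import AVFoundation

// Sketch: build an instruction for the overlap region that fades the
// front track (trackA) out while the rear track (trackB) shows through.
func crossFadeInstruction(trackA: AVCompositionTrack,
                          trackB: AVCompositionTrack,
                          transitionRange: CMTimeRange) -> AVMutableVideoCompositionInstruction {
    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = transitionRange

    let fadeOut = AVMutableVideoCompositionLayerInstruction(assetTrack: trackA)
    // Ramp opacity 1 -> 0 across the overlap; trackB appears underneath.
    fadeOut.setOpacityRamp(fromStartOpacity: 1.0,
                           toEndOpacity: 0.0,
                           timeRange: transitionRange)

    let rear = AVMutableVideoCompositionLayerInstruction(assetTrack: trackB)
    instruction.layerInstructions = [fadeOut, rear]
    return instruction
}
```

The resulting instruction goes into the `instructions` array of an AVMutableVideoComposition, which you then attach to your AVPlayerItem or AVAssetExportSession.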
Is there any way using AVFoundation and CoreVideo to get color info, aperture and focal length values in real-time?
I am trying to create a CMSampleBufferRef from the data and trying to feed it to AVAssetWriter. But the asset writer is failing to create the movie from the data. Following is the code
I think that it's clear from the title... The Problem: I already have some Objective-C code that uses AVFoundation and its AVAudioPlayer to play some sounds in my iPhone apps, it takes
I have a serious problem: I have an NSArray with several UIImage objects. What I now want to do is create a movie from those UIImages. But I don't have any idea how to do so.
I’d like to convert a CGImage to CMSampleBufferRef and append it to an AVAssetWriterInput using the appendSampleBuffer: method. I’ve managed to get the CMSampleBufferRef using the following code, but
so I have this variable in my delegate, AppDelegate.h: AVAudioPlayer *introSound; It plays continuously during first load.
Is it possible to convert a UIImage instance to a CMSampleBufferRef so that it can be appended to a specified output file using AVAssetWriter's appendSampleBuffer: method?
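A common answer to this family of questions is to skip CMSampleBufferRef entirely: render the UIImage into a CVPixelBuffer and append it through an AVAssetWriterInputPixelBufferAdaptor, which takes pixel buffers plus a presentation time. Below is a hedged sketch; the output size and pixel format are assumptions you would match to your writer's settings.

```swift
import AVFoundation
import UIKit

// Sketch: render a UIImage into a CVPixelBuffer suitable for
// AVAssetWriterInputPixelBufferAdaptor.append(_:withPresentationTime:).
func pixelBuffer(from image: UIImage, size: CGSize) -> CVPixelBuffer? {
    let attrs: [CFString: Any] = [
        kCVPixelBufferCGImageCompatibilityKey: true,
        kCVPixelBufferCGBitmapContextCompatibilityKey: true
    ]
    var buffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                     Int(size.width), Int(size.height),
                                     kCVPixelFormatType_32ARGB,
                                     attrs as CFDictionary, &buffer)
    guard status == kCVReturnSuccess, let pb = buffer,
          let cgImage = image.cgImage else { return nil }

    CVPixelBufferLockBaseAddress(pb, [])
    defer { CVPixelBufferUnlockBaseAddress(pb, []) }

    // Draw the image into the buffer's backing memory.
    guard let context = CGContext(data: CVPixelBufferGetBaseAddress(pb),
                                  width: Int(size.width),
                                  height: Int(size.height),
                                  bitsPerComponent: 8,
                                  bytesPerRow: CVPixelBufferGetBytesPerRow(pb),
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)
    else { return nil }
    context.draw(cgImage, in: CGRect(origin: .zero, size: size))
    return pb
}
```

Usage would look roughly like `adaptor.append(pb, withPresentationTime: CMTime(value: frameIndex, timescale: 30))`, appending one buffer per frame while `writerInput.isReadyForMoreMediaData` is true.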
The application I am currently working on has as its main functionality scanning QR/bar codes continuously using the Zxing library (http://code.google.com/p/zxing/). For continuous frame capturing I used to initialize