I have an AVMutableComposition with the following track segments: video: empty: Y, {{0/1 = 0.000}, {48/100 = 0.480}}
I want to get the pixels of the image from the camera. I am using CoreFoundation and OpenGL, and I can render the image, but I want to do some other things with it (in another place/thread), so I need to copy it.
I am trying to capture an image during a live preview from the camera, via AVFoundation's captureStillImageAsynchronouslyFromConnection. So far the program works as expected. However,
I'm currently using an AVPlayer, along with an AVPlayerLayer, to play back some video. While playing back the video, I've registered for time updates every 30th of a second during the video. This is
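A minimal sketch of registering for time updates at a 1/30-second interval with `addPeriodicTimeObserver(forInterval:queue:using:)`. The file path is a placeholder, not from the original question:

```swift
import AVFoundation

// Placeholder URL; the original question plays back its own video asset.
let player = AVPlayer(url: URL(fileURLWithPath: "/path/to/video.mov"))

// CMTime(value: 1, timescale: 30) is one tick of a 30 Hz clock.
let interval = CMTime(value: 1, timescale: 30)

let observer = player.addPeriodicTimeObserver(forInterval: interval,
                                              queue: .main) { time in
    // Invoked roughly 30 times per second during playback.
    print("playback time:", CMTimeGetSeconds(time))
}

// Remove the observer when finished, or it is retained:
// player.removeTimeObserver(observer)
```

Note that the callback is only invoked while the player is actually playing; it does not fire during pauses or stalls.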
I am trying to access the raw data for an audio file on the iPhone/iPad. I have the following code, which is a basic start down the path I need. However, I am stumped at what to do once I have an AudioBuffer
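One way to get at the raw samples is to skip the Audio File APIs entirely and let AVAssetReader decode to linear PCM. This is a sketch under assumed output settings (16-bit interleaved little-endian PCM), not the original poster's code:

```swift
import AVFoundation

// Decode an audio file to an array of Int16 PCM samples.
// Assumption: the file is readable by AVURLAsset and has at least one audio track.
func readPCMSamples(from url: URL) throws -> [Int16] {
    let asset = AVURLAsset(url: url)
    let reader = try AVAssetReader(asset: asset)
    guard let track = asset.tracks(withMediaType: .audio).first else { return [] }

    let settings: [String: Any] = [
        AVFormatIDKey: kAudioFormatLinearPCM,
        AVLinearPCMBitDepthKey: 16,
        AVLinearPCMIsFloatKey: false,
        AVLinearPCMIsBigEndianKey: false,
        AVLinearPCMIsNonInterleaved: false
    ]
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: settings)
    reader.add(output)
    guard reader.startReading() else { return [] }

    var samples: [Int16] = []
    while let sampleBuffer = output.copyNextSampleBuffer(),
          let block = CMSampleBufferGetDataBuffer(sampleBuffer) {
        let length = CMBlockBufferGetDataLength(block)
        var chunk = [Int16](repeating: 0, count: length / 2)
        chunk.withUnsafeMutableBytes {
            // Copy the block buffer's bytes out into our own storage.
            _ = CMBlockBufferCopyDataBytes(block, atOffset: 0,
                                           dataLength: length,
                                           destination: $0.baseAddress!)
        }
        samples.append(contentsOf: chunk)
    }
    return samples
}
```

The same loop shape applies if you stay with AudioBuffer/ExtAudioFile; the key step either way is copying the buffer's bytes into memory you own before processing.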
I'm trying to do some image processing on iPhone. I'm using http://developer.apple.com/library/ios/#qa/qa2010/qa1702.html to capture the camera frames.
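The QA1702 pattern delivers each frame as a CMSampleBuffer; the pixel data lives in the CVPixelBuffer inside it. A sketch of reading those bytes, assuming the output is configured for `kCVPixelFormatType_32BGRA`:

```swift
import AVFoundation
import CoreVideo

// Lock the pixel buffer, copy its bytes out, unlock.
// Copying lets the capture pipeline reuse the buffer immediately.
func pixelBytes(from pixelBuffer: CVPixelBuffer) -> [UInt8] {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return [] }
    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)  // may include padding
    let height = CVPixelBufferGetHeight(pixelBuffer)

    let raw = UnsafeRawBufferPointer(start: base, count: bytesPerRow * height)
    return [UInt8](raw)
}
```

In the capture delegate you would call it as `if let pb = CMSampleBufferGetImageBuffer(sampleBuffer) { let bytes = pixelBytes(from: pb) }`. Note that `bytesPerRow` can be larger than `width * 4` because of row padding, so index rows by `bytesPerRow`, not by width.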
Is there any reason why this wouldn't work? I'm testing for sound in the microphone and playing a looping sound effect while I'm registering sound.
I'm trying to use AVCaptureSession to get images from the front camera for processing. So far, each time a new frame was available I simply assigned it to a variable, and ran an NSTimer that checks every
I have written code to export a song from the iPod library to my app. When I try to export a song, I get the following error message.
I am assembling a bunch of video clips filmed on the iPhone in portrait mode. To assemble them, I am taking a straightforward approach, as follows:
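The straightforward approach can be sketched as appending each clip's video track to one AVMutableComposition track, back to back. The function name and URL list are illustrative, not from the original question; the portrait caveat at the end is the usual pitfall with iPhone footage:

```swift
import AVFoundation

// Concatenate the video tracks of several clips into one composition.
func assemble(clipURLs: [URL]) throws -> AVMutableComposition {
    let composition = AVMutableComposition()
    guard let videoTrack = composition.addMutableTrack(
        withMediaType: .video,
        preferredTrackID: kCMPersistentTrackID_Invalid) else { return composition }

    var cursor = CMTime.zero
    for url in clipURLs {
        let asset = AVURLAsset(url: url)
        guard let source = asset.tracks(withMediaType: .video).first else { continue }
        let range = CMTimeRange(start: .zero, duration: asset.duration)
        try videoTrack.insertTimeRange(range, of: source, at: cursor)
        cursor = CMTimeAdd(cursor, asset.duration)
    }

    // iPhone portrait footage is stored rotated, with the rotation recorded in
    // preferredTransform; carry it over or the result plays back sideways.
    if let firstURL = clipURLs.first,
       let first = AVURLAsset(url: firstURL).tracks(withMediaType: .video).first {
        videoTrack.preferredTransform = first.preferredTransform
    }
    return composition
}
```

Setting `preferredTransform` from the first clip only works when every clip shares the same orientation; mixed orientations need an AVMutableVideoComposition with per-segment layer instructions instead.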