I'm trying to create a PhoneGap plugin which uses AVFoundation to capture photos and return the image as a base64-encoded string. This all works fine, but the EXIF data seems to be missing from the returned image.
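A common cause is base64-encoding a re-encoded bitmap rather than the original file data. A minimal sketch, assuming the modern AVCapturePhotoOutput API (the question may predate it): the delegate's AVCapturePhoto already carries the EXIF dictionary, and fileDataRepresentation() embeds it in the container.

```swift
import AVFoundation
import ImageIO

class PhotoDelegate: NSObject, AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil, let data = photo.fileDataRepresentation() else { return }
        // Base64-encode the *file* data (JPEG/HEIC container), not a
        // decoded-and-re-encoded UIImage, so the EXIF block survives.
        let base64 = data.base64EncodedString()
        // The EXIF dictionary is also available directly:
        let exif = photo.metadata[kCGImagePropertyExifDictionary as String]
        print(base64.count, exif ?? "no EXIF found")
    }
}
```

Re-encoding through UIImage/UIImageJPEGRepresentation strips this metadata, which matches the symptom described.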
I am writing an iPhone app which takes video from the camera, runs it through some OpenGL shader code, and then writes the output to a video file using AVFoundation. The app runs in landscape orientation.
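For landscape recording through AVAssetWriter, one common approach is to leave the pixel data untouched and set a display transform on the writer input so players rotate the track at playback time. A hedged sketch (the dimensions and rotation angle are illustrative; the correct angle depends on which landscape orientation the capture connection reports):

```swift
import AVFoundation

let writerInput = AVAssetWriterInput(
    mediaType: .video,
    outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: 1280,
        AVVideoHeightKey: 720,
    ])
// Players apply this transform on playback, so the buffers themselves
// need no rotation pass in the shader pipeline.
writerInput.transform = CGAffineTransform(rotationAngle: .pi / 2)
```

The alternative, rotating in the OpenGL shader, costs GPU time per frame; the transform is metadata-only.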
I am trying to save the sample buffer, instead of a UIImage, to an array so I can convert it later on. This is to speed up image capture and perhaps avoid memory warnings. I just can't figure out how to save the sample buffers.
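The catch is that sample buffers delivered to the capture delegate come from a small reuse pool, so retaining many of them stalls capture. A hedged sketch of one workaround, deep-copying the pixel data out of each buffer (this assumes a packed, non-planar pixel format such as 32BGRA):

```swift
import AVFoundation
import CoreMedia

var storedBuffers: [CVPixelBuffer] = []

func store(_ sampleBuffer: CMSampleBuffer) {
    guard let src = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    var copy: CVPixelBuffer?
    CVPixelBufferCreate(nil,
                        CVPixelBufferGetWidth(src),
                        CVPixelBufferGetHeight(src),
                        CVPixelBufferGetPixelFormatType(src),
                        nil, &copy)
    guard let dst = copy else { return }
    // Copy the raw pixel bytes so the pooled buffer can be recycled.
    CVPixelBufferLockBaseAddress(src, .readOnly)
    CVPixelBufferLockBaseAddress(dst, [])
    memcpy(CVPixelBufferGetBaseAddress(dst),
           CVPixelBufferGetBaseAddress(src),
           CVPixelBufferGetDataSize(src))
    CVPixelBufferUnlockBaseAddress(dst, [])
    CVPixelBufferUnlockBaseAddress(src, .readOnly)
    storedBuffers.append(dst)
}
```

Memory use grows with the number of stored frames, so this trades the capture stall for RAM; converting to images can then happen off the capture queue.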
I've looked and looked for an answer, but can't seem to find one. Lots have asked, but none have gotten answers. I have an app that records audio using AVAudioRecorder. Now I just want
I'm building an app that shows a video preview layer using AVCaptureVideoPreviewLayer. The setup of that is pretty straightforward, and it seems to work well.
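For reference, a minimal sketch of the preview-layer setup being described (device selection and view attachment are assumptions, since the question's own code is not quoted here):

```swift
import AVFoundation

let session = AVCaptureSession()
session.sessionPreset = .high

// Attach the default camera as input.
if let device = AVCaptureDevice.default(for: .video),
   let input = try? AVCaptureDeviceInput(device: device),
   session.canAddInput(input) {
    session.addInput(input)
}

let previewLayer = AVCaptureVideoPreviewLayer(session: session)
previewLayer.videoGravity = .resizeAspectFill
// previewLayer.frame = view.bounds  // then add it as a sublayer of the view's layer
session.startRunning()
```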
I want to make a video chat app which allows two users to video chat using their iPhones or an iPad 2. I started with the AVFoundation framework and a socket connection (CFStreams) to open a socket
I'm trying to use the AVAudioPlayer class that's included in Apple's AVFoundation framework. The problem is that, for some reason, Xcode won't properly import the AVFoundation framework. I can't use the #import directive.
I'm using AVFoundation on OS X Lion to do screen capture, accomplished as follows: self->screenInput = [[AVCaptureScreenInput alloc] initWithDisplayID:self->screen];
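A hedged sketch of the full session around that AVCaptureScreenInput line (the display ID and frame-rate cap are assumptions; the question only quotes the input-creation step):

```swift
import AVFoundation

let session = AVCaptureSession()

// Equivalent of the quoted Objective-C line, targeting the main display.
let screenInput = AVCaptureScreenInput(displayID: CGMainDisplayID())
if session.canAddInput(screenInput) {
    screenInput.minFrameDuration = CMTime(value: 1, timescale: 30)  // cap at 30 fps
    session.addInput(screenInput)
}

// Write the captured screen to a movie file.
let output = AVCaptureMovieFileOutput()
if session.canAddOutput(output) {
    session.addOutput(output)
}
session.startRunning()
```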
I am a newbie trying to capture camera video images using AVFoundation and want to render the captured frames without using AVCaptureVideoPreviewLayer.
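To render frames yourself rather than through a preview layer, the usual route is an AVCaptureVideoDataOutput whose delegate receives each frame as a CMSampleBuffer. A hedged sketch (the Core Image hand-off stands in for whatever renderer the app actually uses):

```swift
import AVFoundation
import CoreImage

class FrameReceiver: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let image = CIImage(cvPixelBuffer: pixelBuffer)
        // Hand `image` to your own drawing code (Core Image, OpenGL, Metal, ...).
        _ = image
    }
}

let receiver = FrameReceiver()  // keep a strong reference to the delegate
let videoOutput = AVCaptureVideoDataOutput()
videoOutput.videoSettings =
    [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
videoOutput.setSampleBufferDelegate(receiver,
                                    queue: DispatchQueue(label: "frames"))
// Then session.addOutput(videoOutput) on a configured AVCaptureSession.
```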
I'm using the AVAudioPlayer class to play sound in my app. When I play a sound for the first time, the screen freezes for 2-3 seconds; after that it becomes responsive, with no further freezes or delays, even when I change the sound.
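That first-play stall typically comes from the audio hardware being set up lazily on the first play() call. A hedged sketch of the usual fix, creating the player ahead of time and calling prepareToPlay() (the URL and function names are illustrative):

```swift
import AVFoundation

var player: AVAudioPlayer?

func preloadSound(at url: URL) {
    player = try? AVAudioPlayer(contentsOf: url)
    player?.prepareToPlay()  // preloads buffers and primes the audio hardware
}

func playSound() {
    player?.play()  // starts without the multi-second first-play stall
}
```

Calling preloadSound early (e.g. at launch or view load) moves the setup cost off the user-visible tap handler, which matches why later plays are instant.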