In my app I insert an edited audio track over a video track and save the result as a single file (building an AVMutableComposition first and then exporting it with exportAsynchronouslyWithCompletionHandler).
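A minimal sketch of that flow, not the asker's original code; the URLs, preset, and MP4 output type are assumptions:

```swift
import AVFoundation

// Sketch: overlay one audio track on one video track, export a single MP4.
func overlayAudio(_ audioURL: URL, onVideo videoURL: URL, to outputURL: URL,
                  completion: @escaping (Error?) -> Void) {
    let videoAsset = AVURLAsset(url: videoURL)
    let audioAsset = AVURLAsset(url: audioURL)
    let composition = AVMutableComposition()

    guard let videoTrack = videoAsset.tracks(withMediaType: .video).first,
          let audioTrack = audioAsset.tracks(withMediaType: .audio).first,
          let compVideo = composition.addMutableTrack(
              withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid),
          let compAudio = composition.addMutableTrack(
              withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid)
    else { return }

    do {
        // Lay both source tracks into the composition over the video's duration.
        let range = CMTimeRange(start: .zero, duration: videoAsset.duration)
        try compVideo.insertTimeRange(range, of: videoTrack, at: .zero)
        try compAudio.insertTimeRange(range, of: audioTrack, at: .zero)
    } catch { completion(error); return }

    guard let export = AVAssetExportSession(asset: composition,
                                            presetName: AVAssetExportPresetHighestQuality)
    else { return }
    export.outputURL = outputURL
    export.outputFileType = .mp4
    export.exportAsynchronously { completion(export.error) }
}
```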
I'm working on an app that lets users record voice (among other things) to the app's Documents directory. But while the recording is in progress, I'm actually writing to the app's Caches directory.
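A minimal sketch of recording into Caches and then moving the finished file into Documents; the file name and AAC settings are assumptions:

```swift
import AVFoundation

// Sketch: record into Caches first, move to Documents when done.
// (AVAudioSession configuration is omitted for brevity.)
func startRecording() throws -> AVAudioRecorder {
    let caches = FileManager.default.urls(for: .cachesDirectory, in: .userDomainMask)[0]
    let settings: [String: Any] = [
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVSampleRateKey: 44_100,
        AVNumberOfChannelsKey: 1
    ]
    let recorder = try AVAudioRecorder(url: caches.appendingPathComponent("voice.m4a"),
                                       settings: settings)
    recorder.record()
    return recorder
}

func finishRecording(_ recorder: AVAudioRecorder) throws {
    recorder.stop()
    let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    try FileManager.default.moveItem(at: recorder.url,
                                     to: documents.appendingPathComponent("voice.m4a"))
}
```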
Long-time reader, first-time asker... I am making a music app which uses AVAssetReader to read MP3 data from the iTunes library. I need precise timing, so when I create an AVURLAsset, I use the "AVURLAssetPreferPreciseDurationAndTimingKey" option.
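A minimal sketch of creating the asset with that option and reading decoded PCM; `songURL` (e.g. an MPMediaItem's assetURL) is assumed to exist:

```swift
import AVFoundation

// Sketch: opt in to sample-accurate duration/timing, then read PCM buffers.
func readPCM(from songURL: URL) throws {
    let asset = AVURLAsset(url: songURL,
                           options: [AVURLAssetPreferPreciseDurationAndTimingKey: true])
    guard let track = asset.tracks(withMediaType: .audio).first else { return }

    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(
        track: track,
        outputSettings: [AVFormatIDKey: kAudioFormatLinearPCM])
    reader.add(output)
    guard reader.startReading() else { return }

    while let sampleBuffer = output.copyNextSampleBuffer() {
        // Process the CMSampleBuffer of linear PCM samples here.
        _ = sampleBuffer
    }
}
```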
Should I use AVCaptureConnection's videoOrientation property, or set the transform property of the videoWriterInput, or something else? It seems like AVCaptureConnection's videoOrientation...
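Both options in one sketch; the parameter names are placeholders, and the output and writer input are assumed to be configured elsewhere:

```swift
import AVFoundation

func applyOrientation(videoOutput: AVCaptureVideoDataOutput,
                      videoWriterInput: AVAssetWriterInput) {
    // Option 1: rotate frames at capture time, on the connection.
    if let connection = videoOutput.connection(with: .video),
       connection.isVideoOrientationSupported {
        connection.videoOrientation = .portrait
    }

    // Option 2: record frames as captured, and tag the written track with a
    // display transform that players apply at playback time.
    videoWriterInput.transform = CGAffineTransform(rotationAngle: .pi / 2)
}
```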
I am using AVPlayer to view videos stored on Amazon's CloudFront: HTTP Live Streaming is used, and the playlist and segments are stored on S3 and served through CloudFront.
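For reference, a minimal sketch of the playback side; the playlist URL is a placeholder for the CloudFront distribution, and AVPlayer handles the HLS playlist directly:

```swift
import AVFoundation

// Sketch: point AVPlayer at the .m3u8 playlist served by CloudFront.
let playlistURL = URL(string: "https://d1234.cloudfront.net/video/playlist.m3u8")!
let player = AVPlayer(url: playlistURL)
player.play()
```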
I am new to the whole AVFoundation thing. Before, I was using the good old UIImagePickerControllerSourceTypeCamera.
I am currently trying to implement a very simple UIView to replace UIImagePickerController, and I am running into lag when the image is captured.
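A minimal sketch of such a view, assuming the lag comes from rendering captured frames manually: AVCaptureVideoPreviewLayer draws the live feed directly, so no per-frame UIImage conversion is involved. The class name and queue choice are assumptions:

```swift
import AVFoundation
import UIKit

final class CameraPreviewView: UIView {
    private let session = AVCaptureSession()

    override init(frame: CGRect) {
        super.init(frame: frame)
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // The preview layer renders the feed without touching UIImage at all.
        let preview = AVCaptureVideoPreviewLayer(session: session)
        preview.videoGravity = .resizeAspectFill
        preview.frame = bounds
        layer.addSublayer(preview)

        // startRunning() blocks while the session spins up; keep it off main.
        DispatchQueue.global(qos: .userInitiated).async { self.session.startRunning() }
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}
```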
I'm creating an app that plays a ringtone, and I'd like to know the current time of the playing ringtone, in milliseconds, at any point during playback.
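A minimal sketch with AVAudioPlayer, whose currentTime is reported in seconds; `ringtoneURL` and the polling interval are assumptions:

```swift
import AVFoundation

// Sketch: play the file and poll the position, scaled to milliseconds.
func playAndReport(ringtoneURL: URL) throws {
    let player = try AVAudioPlayer(contentsOf: ringtoneURL)
    player.play()

    // Poll while playing; a CADisplayLink works too for UI-rate updates.
    Timer.scheduledTimer(withTimeInterval: 0.05, repeats: true) { timer in
        guard player.isPlaying else { timer.invalidate(); return }
        let milliseconds = Int(player.currentTime * 1000)
        print("position: \(milliseconds) ms")
    }
}
```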
I want to display a video frame buffer on an OpenGL ES texture. I have downloaded and read the GLVideoFrame sample from Apple.
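A minimal sketch of the texture-cache approach from that era (OpenGL ES is deprecated in favor of Metal today): CVOpenGLESTextureCache maps a decoded CVPixelBuffer into a GL texture without an explicit upload copy. The class name and the BGRA pixel format are assumptions:

```swift
import AVFoundation
import OpenGLES

final class FrameTextureUploader {
    private let context = EAGLContext(api: .openGLES2)!
    private var cache: CVOpenGLESTextureCache?

    init() {
        CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, nil, context, nil, &cache)
    }

    // Wraps a BGRA pixel buffer as a GL texture; bind it with
    // glBindTexture(CVOpenGLESTextureGetTarget(t), CVOpenGLESTextureGetName(t)).
    func makeTexture(from pixelBuffer: CVPixelBuffer) -> CVOpenGLESTexture? {
        var texture: CVOpenGLESTexture?
        CVOpenGLESTextureCacheCreateTextureFromImage(
            kCFAllocatorDefault, cache!, pixelBuffer, nil,
            GLenum(GL_TEXTURE_2D), GL_RGBA,
            GLsizei(CVPixelBufferGetWidth(pixelBuffer)),
            GLsizei(CVPixelBufferGetHeight(pixelBuffer)),
            GLenum(GL_BGRA), GLenum(GL_UNSIGNED_BYTE), 0, &texture)
        return texture
    }
}
```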
I'm using AV Foundation to play an MP3 file loaded over the network, with code that is almost identical to the playback example here: "Putting it all Together: Playing a Video File Using AVPlayerLayer".
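A minimal sketch of that same pattern pointed at a remote MP3; the URL is a placeholder, and waiting for .readyToPlay via KVO mirrors the documentation example:

```swift
import AVFoundation

let item = AVPlayerItem(url: URL(string: "https://example.com/track.mp3")!)
let player = AVPlayer(playerItem: item)

// Start playback only once the item has buffered enough to be ready.
let statusObservation = item.observe(\.status, options: [.initial, .new]) { item, _ in
    if item.status == .readyToPlay {
        player.play()
    }
}
```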