Grabbing the first frame of a video from UIImagePickerController?

I'm trying to get the first frame from the selected video in a UIImagePickerController to show in a UIImageView, but I do not know if it's possible. If it is, how would I do it?


You can do this in one of two ways. The first is to use MPMoviePlayerController to grab the thumbnail:

#import <MediaPlayer/MediaPlayer.h>

// `videoURL` is the URL of the picked video; `time` is the playback time
// (in seconds) to grab the frame from -- use 0 for the first frame.
MPMoviePlayerController *moviePlayer = [[MPMoviePlayerController alloc]
                                        initWithContentURL:videoURL];
moviePlayer.shouldAutoplay = NO;
UIImage *thumbnail = [moviePlayer thumbnailImageAtTime:time
                                            timeOption:MPMovieTimeOptionNearestKeyFrame];

This works, but MPMoviePlayerController is not a particularly lightweight object, nor is it particularly fast at grabbing thumbnails.

The preferred way is to use the new AVAssetImageGenerator in AVFoundation. It is faster, lighter-weight, and more flexible than the old approach. Here's a helper method that returns a thumbnail image from the video.


#import <AVFoundation/AVFoundation.h>

+ (UIImage *)thumbnailImageForVideo:(NSURL *)videoURL
                             atTime:(NSTimeInterval)time
{
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoURL options:nil];
    NSParameterAssert(asset);

    AVAssetImageGenerator *assetIG =
                [[AVAssetImageGenerator alloc] initWithAsset:asset];
    assetIG.appliesPreferredTrackTransform = YES;
    assetIG.apertureMode = AVAssetImageGeneratorApertureModeEncodedPixels;

    NSError *igError = nil;
    // CMTimeMakeWithSeconds keeps `time` in seconds; passing seconds as the
    // value argument of CMTimeMake would divide the requested time by the timescale.
    CGImageRef thumbnailImageRef =
             [assetIG copyCGImageAtTime:CMTimeMakeWithSeconds(time, 60)
                             actualTime:NULL
                                  error:&igError];

    if (!thumbnailImageRef)
        NSLog(@"thumbnailImageGenerationError %@", igError);

    UIImage *thumbnailImage = thumbnailImageRef
                          ? [[UIImage alloc] initWithCGImage:thumbnailImageRef]
                          : nil;

    // copyCGImageAtTime: returns a +1 CGImageRef that ARC does not manage.
    if (thumbnailImageRef)
        CGImageRelease(thumbnailImageRef);

    return thumbnailImage;
}
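
For the question as asked, calling the helper with a time of zero grabs the first frame. A minimal usage sketch (ThumbnailHelper is an assumed name for whichever class declares the method above, and self.imageView an assumed UIImageView outlet):

// Hypothetical names: ThumbnailHelper is the class holding the helper above;
// self.imageView is a UIImageView outlet in your view controller.
UIImage *firstFrame = [ThumbnailHelper thumbnailImageForVideo:videoURL atTime:0];
self.imageView.image = firstFrame;

Note that copyCGImageAtTime:actualTime:error: is synchronous, so for long or remote videos you may prefer the asynchronous variant below.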

Asynchronous usage


- (void)thumbnailImageForVideo:(NSURL *)videoURL
                        atTime:(NSTimeInterval)time
                    completion:(void (^)(UIImage *thumbnail))completion
{
    // Generate the thumbnail off the main thread, then deliver it on the main queue.
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{

        AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoURL options:nil];
        NSParameterAssert(asset);

        AVAssetImageGenerator *assetIG =
            [[AVAssetImageGenerator alloc] initWithAsset:asset];
        assetIG.appliesPreferredTrackTransform = YES;
        assetIG.apertureMode = AVAssetImageGeneratorApertureModeEncodedPixels;

        NSError *igError = nil;
        CGImageRef thumbnailImageRef =
            [assetIG copyCGImageAtTime:CMTimeMakeWithSeconds(time, 60)
                            actualTime:NULL
                                 error:&igError];

        if (!thumbnailImageRef)
            NSLog(@"thumbnailImageGenerationError %@", igError);

        UIImage *thumbnailImage = thumbnailImageRef
            ? [[UIImage alloc] initWithCGImage:thumbnailImageRef]
            : nil;
        if (thumbnailImageRef)
            CGImageRelease(thumbnailImageRef);

        dispatch_async(dispatch_get_main_queue(), ^{
            if (completion)
                completion(thumbnailImage);
        });
    });
}
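
To tie this back to UIImagePickerController: in the picker's delegate callback, pull the video URL out of the info dictionary and hand it to the helper. A minimal sketch, assuming the picker was configured with kUTTypeMovie in its mediaTypes, the asynchronous helper above lives in the same view controller, and self.imageView is an assumed UIImageView outlet:

- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    // The URL of the picked movie (requires the picker's mediaTypes
    // to include kUTTypeMovie).
    NSURL *videoURL = info[UIImagePickerControllerMediaURL];
    [self dismissViewControllerAnimated:YES completion:nil];

    // Grab the very first frame (time 0) off the main thread and show it.
    [self thumbnailImageForVideo:videoURL
                          atTime:0
                      completion:^(UIImage *thumbnail) {
        self.imageView.image = thumbnail;
    }];
}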