How can I "upload" a RAW image file to the iOS simulator?
How can I "upload" a RAW image file to the iOS simulator so that it appears to the AssetsLibrary framework in the same way as a raw image copied from a camera to an iPad through the Camera Connection Kit?
(I do know how to store normal JPEG and PNG images in the iOS simulator. That is not the question.)
Just use
writeImageDataToSavedPhotosAlbum:metadata:completionBlock:
to write the image data of your RAW file to the Saved Photos album. iOS supports many RAW formats (Canon and Nikon are definitely supported). The AssetsLibrary automatically creates JPEGs and previews for your RAW file when you add it to the library.
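Something along these lines should work. This is only a minimal sketch: the file name sample.NEF and the method importBundledRawFile are placeholders for whatever RAW file and call site you actually use.

#import <AssetsLibrary/AssetsLibrary.h>

- (void)importBundledRawFile {
    // Placeholder: a RAW file bundled with the test app.
    NSURL *rawURL = [[NSBundle mainBundle] URLForResource:@"sample" withExtension:@"NEF"];
    NSData *rawData = [NSData dataWithContentsOfURL:rawURL];
    if (rawData == nil) {
        NSLog(@"Could not load the RAW file from the app bundle");
        return;
    }

    // A local ALAssetsLibrary is fine for a quick test; in production code keep the
    // library in an ivar so it outlives the asynchronous write.
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library writeImageDataToSavedPhotosAlbum:rawData
                                     metadata:nil
                              completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error != nil) {
            NSLog(@"Failed to save RAW data: %@", error);
        } else {
            NSLog(@"Saved RAW asset at %@", assetURL);
        }
    }];
}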
I'm not really familiar with RAW data, but I was able to convert a PNG image to raw bitmap data and then convert it back to an image. Here's the code I used to convert back to a UIImage:
- (UIImage *)convertBitmapRGBA8ToUIImage:(unsigned char *)buffer
                               withWidth:(int)width
                              withHeight:(int)height {
    // The buffer is expected to hold tightly packed RGBA8 pixels (4 bytes per pixel).
    size_t bufferLength = width * height * 4;
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer, bufferLength, NULL);
    size_t bitsPerComponent = 8;
    size_t bitsPerPixel = 32;
    size_t bytesPerRow = 4 * width;

    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
    if (colorSpaceRef == NULL) {
        NSLog(@"Error allocating color space");
        CGDataProviderRelease(provider);
        return nil;
    }

    // 32 bits per pixel in a device RGB color space requires an alpha component,
    // so include the alpha info in the bitmap info.
    CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault | kCGImageAlphaPremultipliedLast;
    CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;

    // Wrap the raw bytes in a CGImage without copying them.
    CGImageRef iref = CGImageCreate(width,
                                    height,
                                    bitsPerComponent,
                                    bitsPerPixel,
                                    bytesPerRow,
                                    colorSpaceRef,
                                    bitmapInfo,
                                    provider,   // data provider
                                    NULL,       // decode array
                                    YES,        // should interpolate
                                    renderingIntent);

    uint32_t *pixels = (uint32_t *)malloc(bufferLength);
    if (pixels == NULL) {
        NSLog(@"Error: memory not allocated for bitmap");
        CGDataProviderRelease(provider);
        CGColorSpaceRelease(colorSpaceRef);
        CGImageRelease(iref);
        return nil;
    }

    // Draw the image into a bitmap context backed by our own buffer.
    CGContextRef context = CGBitmapContextCreate(pixels,
                                                 width,
                                                 height,
                                                 bitsPerComponent,
                                                 bytesPerRow,
                                                 colorSpaceRef,
                                                 kCGImageAlphaPremultipliedLast);
    UIImage *image = nil;
    if (context == NULL) {
        NSLog(@"Error: context not created");
    } else {
        CGContextDrawImage(context, CGRectMake(0.0f, 0.0f, width, height), iref);
        CGImageRef imageRef = CGBitmapContextCreateImage(context);

        // Support both iPad 3.2 and iPhone 4 Retina displays with the correct scale.
        if ([UIImage respondsToSelector:@selector(imageWithCGImage:scale:orientation:)]) {
            float scale = [[UIScreen mainScreen] scale];
            image = [UIImage imageWithCGImage:imageRef scale:scale orientation:UIImageOrientationUp];
        } else {
            image = [UIImage imageWithCGImage:imageRef];
        }

        CGImageRelease(imageRef);
        CGContextRelease(context);
    }

    CGColorSpaceRelease(colorSpaceRef);
    CGImageRelease(iref);
    CGDataProviderRelease(provider);
    free(pixels);   // freed exactly once, whether or not the context was created

    return image;
}
I can't really remember the source, but it worked for me.
Were you able to load the RAW image using your code? If so, just pass the data to the function above (you will also need the image's width and height for it to be converted correctly).
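For example, a hypothetical call site. The names rgbaBuffer, imageWidth, and imageHeight are placeholders for the output of your own RAW decoding step; they are not part of the code above.

// Convert the decoded RGBA8 buffer into a UIImage.
UIImage *decoded = [self convertBitmapRGBA8ToUIImage:rgbaBuffer
                                           withWidth:imageWidth
                                          withHeight:imageHeight];
if (decoded != nil) {
    // Save the decoded bitmap to the simulator's photo library for inspection.
    UIImageWriteToSavedPhotosAlbum(decoded, nil, NULL, NULL);
}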