CoreGraphics on iPhone. What is the proper way to load a PNG file with pre-multiplied alpha?

For my iPhone 3D apps I am currently using CoreGraphics to load PNG files that have pre-multiplied alpha. Here are the essentials:

    // load a 4-channel RGBA png file with pre-multiplied alpha
    CGContextRef context =
        CGBitmapContextCreate(data, width, height, 8, num_channels * width,
                              colorSpace,
                              kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);

    // draw the image into the context
    CGRect rect = CGRectMake(0, 0, width, height);
    CGContextDrawImage(context, rect, image);
    CGContextRelease(context);

I then go on and use the data as a texture in OpenGL.
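Roughly, the upload and blend setup then look like this (an OpenGL ES 1.x sketch; the texture-parameter choices are just illustrative, and the blend function assumes the texels stay premultiplied):

    // OpenGL ES 1.x (#include <OpenGLES/ES1/gl.h>); data/width/height as above
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, data);

    // premultiplied texels pair with this blend mode
    glEnable(GL_BLEND);
    glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);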

My question: by specifying pre-multiplied alpha (kCGImageAlphaPremultipliedLast), am I inadvertently telling CoreGraphics to multiply the r, g, and b channels by alpha, or am I, as I have assumed, merely indicating that the incoming image's format already has pre-multiplied alpha?

Thanks, Doug


By specifying pre-multiplied alpha (kCGImageAlphaPremultipliedLast), am I inadvertently telling CoreGraphics to multiply the r, g, and b channels by alpha, or am I, as I have assumed, merely indicating that the incoming image's format already has pre-multiplied alpha?

Neither one. You specified the format of the destination (the context), not the source. The source—the image object—knows whether its pixels' colors are premultiplied or not, and Quartz will use that knowledge to do the right thing when it draws the source image into the context.

This means that the pixels in your buffer will end up premultiplied (not twice: they are premultiplied in the context buffer whether or not the source colors already were; if they were, there is no need to multiply again, and if they weren't, Quartz multiplies them for the first time).
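If you want to see what the source image itself claims, you can ask it directly; here is a quick sketch using the image variable from your code:

    // the context flags describe only the destination; the source carries
    // its own alpha info, which Quartz consults when drawing
    CGImageAlphaInfo info = CGImageGetAlphaInfo(image);
    if (info == kCGImageAlphaPremultipliedLast || info == kCGImageAlphaPremultipliedFirst) {
        // source pixels are already premultiplied; Quartz will not multiply again
    } else if (info == kCGImageAlphaLast || info == kCGImageAlphaFirst) {
        // source has straight alpha; Quartz premultiplies as it draws into the context
    }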

I don't know enough OpenGL to know whether that's a problem, and if it is, there is probably no solution for this code: On the Mac, at least, Quartz does not support unpremultiplied colors in bitmap contexts.
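You can see that limitation directly by asking for a straight-alpha context and checking the result; this is only a sketch reusing the variables from your code, and the expectation is a NULL context plus a console warning:

    // straight (unpremultiplied) alpha is not a supported bitmap-context
    // format, so this call is expected to fail and return NULL
    CGContextRef unpremultiplied =
        CGBitmapContextCreate(data, width, height, 8, num_channels * width,
                              colorSpace,
                              kCGImageAlphaLast | kCGBitmapByteOrder32Big);
    if (unpremultiplied == NULL) {
        // fall back to a premultiplied context, and un-premultiply the bytes
        // yourself afterwards if you really need straight alpha
    }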

You might try this storage format instead.


For future reference, here's an example of how to de-premultiply, for anyone stumbling upon this problem:

    /////// ....
    /////// load the image into imageRef up here,
    /////// e.g. with CGImageSourceCreateWithURL() and CGImageSourceCreateImageAtIndex() on OS X,
    /////// or by accessing the CGImage property of UIImage;
    /////// then extract bitsPerComponent, bytesPerRow and the color space
    /////// ....


    // draw the CGImage into a byte array via a premultiplied bitmap context
    CGContextRef bitmapContext = CGBitmapContextCreate(img->data,
                                                       img->w, img->h,
                                                       bitsPerComponent,
                                                       bytesPerRow,
                                                       space,
                                                       kCGImageAlphaPremultipliedLast);

    CGContextDrawImage(bitmapContext,
                       CGRectMake(0.0, 0.0, (float)img->w, (float)img->h),
                       imageRef);
    CGContextRelease(bitmapContext);

    // undo premultiplication: divide each color channel by alpha/255
    // (skip fully transparent pixels to avoid dividing by zero; this also
    //  assumes the rows are tightly packed, i.e. bytesPerRow == 4 * img->w)
    for (int i = 0; i < bytesPerRow * img->h; i += 4)
    {
        int alpha = img->data[i + 3];
        if (alpha == 0)
            continue;
        for (int j = 0; j < 3; j++)
        {
            img->data[i + j] = img->data[i + j] / (alpha / 255.);
        }
    }
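After that loop the buffer holds straight (un-premultiplied) alpha again, so if it is uploaded to OpenGL the conventional blend mode applies rather than the premultiplied one; a minimal sketch:

    // straight-alpha texels pair with the conventional source-alpha blend mode
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);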
