
Draw a texture in OpenGL while ignoring its alpha channel

I have a texture loaded into memory that is of RGBA format with various alpha values.

The image is loaded like so:

 GLuint texture = 0;
 glGenTextures(1, &texture);
 glBindTexture(GL_TEXTURE_2D, texture);
 self.texNum = texture;

 glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,GL_LINEAR); 
 glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER,GL_LINEAR); 

 glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
 glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

 glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, self.imageWidth, self.imageHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, [self.imageData bytes]);

I want to know how I can draw this texture so that the alpha channel in the image is treated as all 1's and the texture is drawn like an RGB image.

Consider the base image:

[image: base texture]

This image is a progression from 0 to 255 alpha and has an RGB value of (255, 0, 0) throughout.

However, if I draw it with blending disabled I get an image that looks like this: www.ldeo.columbia.edu/~jcoplan/alpha/no_alpha.png

When what I really want is an image that looks like this: www.ldeo.columbia.edu/~jcoplan/alpha/correct.png

I'd really appreciate some pointers to have it ignore the alpha channel completely. Note that I can't just load the image in as an RGB initially because I do need the alpha channel at other points.

Edit: I tried to use GL_COMBINE to solve my problem, like so:

glColor4f(1, 1, 1, 1);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);

glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB, GL_REPLACE);
glTexEnvi(GL_TEXTURE_ENV, GL_SRC0_RGB, GL_TEXTURE);
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_RGB, GL_SRC_COLOR);

glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_ALPHA, GL_REPLACE);
glTexEnvi(GL_TEXTURE_ENV, GL_SRC0_ALPHA, GL_PRIMARY_COLOR);
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_ALPHA, GL_SRC_ALPHA); 
[self drawTexture];

But still no luck: it still draws black fading to red.


I have a texture loaded into memory that is of RGBA format with various alpha values

glDisable(GL_BLEND)

However, if I draw it with blending disabled I get an image that looks like this: www.ldeo.columbia.edu/~jcoplan/alpha/no_alpha.png

This happens because in your source image all transparent pixels are black. It's a problem with your texture/image, or maybe with the loader function, but it is not an OpenGL problem.

You could probably try to fix it using glTexEnv with GL_COMBINE (i.e. mix the texture color with the underlying color based on the alpha channel), but since I haven't done something like that, I'm not completely sure and can't give you the exact operands. It was possible in Direct3D 9 (using D3DTOP_MODULATEALPHA_ADDCOLOR), so most likely there is a way to do it in OpenGL.
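
For what it's worth, here is a minimal sketch of that idea in the fixed-function pipeline (not taken from this answer, and untested against the questioner's setup). GL_INTERPOLATE computes Arg0 * Arg2 + Arg1 * (1 - Arg2), so using the texture's alpha as Arg2 mixes the texture color with the underlying (primary) color:

 glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);

 // RGB = texture.rgb * texture.a + primary.rgb * (1 - texture.a)
 glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB, GL_INTERPOLATE);
 glTexEnvi(GL_TEXTURE_ENV, GL_SRC0_RGB, GL_TEXTURE);
 glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_RGB, GL_SRC_COLOR);
 glTexEnvi(GL_TEXTURE_ENV, GL_SRC1_RGB, GL_PRIMARY_COLOR);
 glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND1_RGB, GL_SRC_COLOR);
 glTexEnvi(GL_TEXTURE_ENV, GL_SRC2_RGB, GL_TEXTURE);
 glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND2_RGB, GL_SRC_ALPHA);

 // Force the output alpha to the primary color's alpha (1.0 if glColor4f(..., 1) was set)
 glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_ALPHA, GL_REPLACE);
 glTexEnvi(GL_TEXTURE_ENV, GL_SRC0_ALPHA, GL_PRIMARY_COLOR);
 glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_ALPHA, GL_SRC_ALPHA);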


You should not disable blending, but instead use glBlendFunc with the proper parameters:

glBlendFunc(GL_ONE, GL_ZERO);
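
As a minimal sketch of how that would sit around the questioner's own draw call (drawTexture is the questioner's method, not part of this answer):

 glEnable(GL_BLEND);
 // source * 1 + destination * 0: the texture color is written unchanged,
 // so its alpha has no effect on the blend result
 glBlendFunc(GL_ONE, GL_ZERO);
 [self drawTexture];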


Or you could tell OpenGL to upload only the RGB channels of your image using

glPixelStorei(GL_UNPACK_ALIGNMENT, 4)

before you call glTexImage2D with the format set to GL_RGB. This will cause it to skip the fourth byte of every pixel, i.e. the alpha channel.
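
If relying on unpack settings feels fragile, a sketch of the same idea done explicitly, reusing the questioner's properties (the rgb buffer here is hypothetical), is to repack the RGBA bytes into a tightly packed RGB buffer on the CPU and upload that instead:

 const uint8_t *src = [self.imageData bytes];
 uint8_t *rgb = (uint8_t *)malloc(self.imageWidth * self.imageHeight * 3);
 for (int i = 0, j = 0; i < self.imageWidth * self.imageHeight; i++) {
     rgb[j++] = src[i * 4 + 0];  // R
     rgb[j++] = src[i * 4 + 1];  // G
     rgb[j++] = src[i * 4 + 2];  // B (the alpha byte is simply dropped)
 }
 glPixelStorei(GL_UNPACK_ALIGNMENT, 1);  // rows of 3-byte pixels may not be 4-byte aligned
 glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, self.imageWidth, self.imageHeight,
              0, GL_RGB, GL_UNSIGNED_BYTE, rgb);
 free(rgb);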


I had a similar problem, and found out that it was because iOS image loading was doing a premultiply on the RGB values (as discussed in some of the other answers and comments here). I'd love to know whether there's a way of disabling pre-multiplication, but in the meantime I'm "un-pre-multiplying" using code derived from this thread and this thread.

    // Un-premultiply: scale each color channel back up by 255/alpha so the
    // original color is recovered even where alpha is low
    uint8_t *imageBytes = (uint8_t *)imageData;
    int byteCount = width * height * 4;
    for (int i = 0; i < byteCount; i += 4) {
        uint8_t a = imageBytes[i + 3];
        // Fully opaque and fully transparent pixels need no adjustment
        if (a != 255 && a != 0) {
            float alphaFactor = 255.0f / a;
            imageBytes[i]     *= alphaFactor;  // R
            imageBytes[i + 1] *= alphaFactor;  // G
            imageBytes[i + 2] *= alphaFactor;  // B
        }
    }
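
(In the loading code from the question, this loop would presumably run over a mutable copy of self.imageData right before the glTexImage2D call.)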
