
Strange floating point arithmetic in a WebGL fragment shader

I'm writing a simple WebGL program. I'm stuck with a strange behavior in a shader program:

the condition ( 1.01 * 2.0 > 1.0 ) evaluates to true, but the condition ( 0.99 * 2.0 > 1.0 ) evaluates to false.

Every time I multiply a number less than 1.0 by something, I get a number less than 1.0.

Why is it so?

[EDIT]

I'm using a fragment shader to convert 16-bit unsigned integer data to an 8-bit bitmap using window level and window width (a linear ramp) and display it on screen. Since there is no direct way of storing 16-bit data as an internal format in WebGL (AFAIK), I decided to create a Uint8Array(width*height*3), store the first (low) byte in the R channel and the second (high) byte in the G channel, and put it in a gl.RGB, gl.UNSIGNED_BYTE texture (maybe gl.LUMINANCE_ALPHA would be better).
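For reference, a minimal sketch of that packing and upload step (data16, width, height, and texture are assumed names here, not my exact code):

// data16: a Uint16Array of width*height pixel values; texture: an already-created texture.
var packed = new Uint8Array(width * height * 3);
for (var i = 0; i < width * height; i++) {
    packed[3 * i + 0] = data16[i] & 0xFF;        // low byte  -> R
    packed[3 * i + 1] = (data16[i] >> 8) & 0xFF; // high byte -> G
    packed[3 * i + 2] = 0;                       // unused
}
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.pixelStorei(gl.UNPACK_ALIGNMENT, 1); // RGB rows are not 4-byte aligned in general
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB, width, height, 0,
              gl.RGB, gl.UNSIGNED_BYTE, packed);
// NEAREST filtering matters: interpolating packed bytes would corrupt the values.
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);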

In the shader I reconstruct the 16-bit word from the two bytes and do the leveling. The shader program:

#ifdef GL_ES
precision highp float;
#endif

uniform highp sampler2D   s_texture;
uniform highp float       u_ww;
uniform highp float       u_wc;
varying highp vec2        v_texcoord;

void main(void)
{
    highp vec3 color = texture2D(s_texture, v_texcoord).rgb;
    highp float S = 255.;
    highp float d = 65280.;   // 255. * 256.
    highp float m = 0.9;
    // Reconstruct the 16-bit value: R holds the low byte, G the high byte.
    highp float c = color.g*d + color.r*S;

    float dp = 0.;

    float min = u_wc - u_ww/2.;
    float max = u_wc + u_ww/2.;

    // Sanity test for floating point arithmetic; should always be true.
    if( 1.0*2.0 > 1.1 ) gl_FragData[0] = vec4(1,0,0,1);
    else{
        if( c<min ) c=0.;
        else if( c>max ) c=255.;
        else c=(255.*(max-c))/u_ww;

        c = c/S;
        gl_FragData[0] = vec4(c,c,c,1);
    }
}
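
For reference, the JavaScript side that drives this shader looks roughly like the following (a sketch; program, texture, and the window values are placeholders, not my exact code):

gl.useProgram(program);
gl.uniform1i(gl.getUniformLocation(program, 's_texture'), 0); // texture unit 0
gl.uniform1f(gl.getUniformLocation(program, 'u_ww'), 400.0);  // window width (example value)
gl.uniform1f(gl.getUniformLocation(program, 'u_wc'), 1000.0); // window center (example value)
gl.activeTexture(gl.TEXTURE0);
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4); // full-screen quad that supplies v_texcoord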

As you can see, there is a stupid line in the shader: if( 1.0*2.0 > 1.1 ) gl_FragData[0] = vec4(1,0,0,1);. That's where I test the floating point arithmetic. In this example the whole image is red, but when the condition is 0.999999999*2.0 > 1.1 the answer is false. I started to suspect something when I got strange artifacts in my resampled 16-bit image.

I tested it on Chrome 8 and Firefox 4. I believe I don't understand something about floating point arithmetic.
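
For what it's worth, newer WebGL implementations let you ask from JavaScript what highp actually means in the fragment shader (a quick check, assuming gl is the WebGL context and the call is available):

// Query the real precision behind 'highp float' in the fragment shader.
var fmt = gl.getShaderPrecisionFormat(gl.FRAGMENT_SHADER, gl.HIGH_FLOAT);
console.log('highp float: rangeMin=' + fmt.rangeMin +
            ', rangeMax=' + fmt.rangeMax +
            ', precision=' + fmt.precision + ' bits');
// precision === 0 means highp is not supported in fragment shaders on this GPU.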


Maybe it isn't using floats.

Try this:

float xa = 1.01; // then change to 0.99
float xb = 1.0;
float xc = 2.0;
if (xa * xc > xb)
    ...

Does that help? If so, maybe the compiler (or hardware) is converting them to int for some reason.
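
Along the same lines, you can rule out compile-time constant folding entirely by feeding the operands in as uniforms, so the multiplication has to happen at runtime on the GPU (a sketch; u_a and u_b are hypothetical uniform names you would add to the shader):

// Shader side (hypothetical): uniform float u_a; uniform float u_b;
//                             if (u_a * u_b > 1.0) { ... }
gl.useProgram(program);
gl.uniform1f(gl.getUniformLocation(program, 'u_a'), 0.99);
gl.uniform1f(gl.getUniformLocation(program, 'u_b'), 2.0);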


You should strip down the shader to the smallest possible case.

if( 1.0*2.0 > 1.1 ) gl_FragData[0] = vec4(1,0,0,1);
else gl_FragData[0] = vec4(1,0,1,1);

and

if( 0.999999*2.0 > 1.1 ) gl_FragData[0] = vec4(1,0,0,1);
else gl_FragData[0] = vec4(1,0,1,1);

How many 9s do you need to add to trigger the bug? Does it still happen with just 0.9?

Once you have done that, use that code to file a bug against Chrome or Firefox (they use the same GLSL parser). It's entirely possible that there is a bug in parsing constants with too many significant digits.
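
If it helps, that bisecting can be automated by compiling a tiny shader around each candidate constant and reading a pixel back (a sketch, assuming an existing WebGL context gl; none of this is from the original code):

// Compile a minimal shader around a candidate constant, draw a full-screen
// triangle, and read one pixel back to see which branch was taken.
function testConstant(gl, literal) {
    var vsSrc = 'attribute vec2 a_pos;' +
                'void main() { gl_Position = vec4(a_pos, 0.0, 1.0); }';
    var fsSrc = 'precision highp float;' + // as in the original shader
                'void main() {' +
                '  if (' + literal + ' * 2.0 > 1.1) gl_FragData[0] = vec4(1, 0, 0, 1);' +
                '  else gl_FragData[0] = vec4(1, 0, 1, 1);' +
                '}';
    var prog = gl.createProgram();
    [[gl.VERTEX_SHADER, vsSrc], [gl.FRAGMENT_SHADER, fsSrc]].forEach(function (pair) {
        var sh = gl.createShader(pair[0]);
        gl.shaderSource(sh, pair[1]);
        gl.compileShader(sh);
        if (!gl.getShaderParameter(sh, gl.COMPILE_STATUS))
            console.log(gl.getShaderInfoLog(sh));
        gl.attachShader(prog, sh);
    });
    gl.linkProgram(prog);
    gl.useProgram(prog);

    var buf = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, buf);
    // One triangle that covers the whole viewport.
    gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([-1, -1, 3, -1, -1, 3]), gl.STATIC_DRAW);
    var loc = gl.getAttribLocation(prog, 'a_pos');
    gl.enableVertexAttribArray(loc);
    gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);
    gl.drawArrays(gl.TRIANGLES, 0, 3);

    var px = new Uint8Array(4);
    gl.readPixels(0, 0, 1, 1, gl.RGBA, gl.UNSIGNED_BYTE, px);
    return px[2] === 0; // blue channel 0 -> red -> condition was true
}

['1.0', '0.9', '0.99', '0.999999', '0.999999999'].forEach(function (c) {
    console.log(c + ' * 2.0 > 1.1 : ' + testConstant(gl, c));
});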

