
HLSL: problematic pixel shader code (alpha at zero when sampling)?

I have a strange problem with samplers in my pixel shaders. When I sample from a sampler directly into a fresh float4 variable, I always get a black/transparent color back. So with this code I get a black screen:

float4 PixelShaderFunction(float2 TextureCoordinate : TEXCOORD0) : COLOR0
{
    float2 uv = TextureCoordinate;
    float4 pixelColor = tex2D(implicitInputSampler, uv);

    // contrast
    pixelColor.rgb = ((pixelColor.rgb - 0.5f) * max(Contrast, 0)) + 0.5f;

    // brightness
    pixelColor.rgb = pixelColor.rgb + (Brightness - 1);

    // return final pixel color
    return pixelColor;
}

If I use this instead, it works OK:

float4 PixelShaderFunction(float2 TextureCoordinate : TEXCOORD0) : COLOR0
{
    float2 uv = TextureCoordinate;
    float4 pixelColor = {0, 0, 0, 1};
    pixelColor += tex2D(implicitInputSampler, uv);

    // contrast
    pixelColor.rgb = ((pixelColor.rgb - 0.5f) * max(Contrast, 0)) + 0.5f;

    // brightness
    pixelColor.rgb = pixelColor.rgb + (Brightness - 1);

    // return final pixel color
    return pixelColor;
}
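
Presumably the second version only works because initializing pixelColor to {0,0,0,1} guarantees the alpha ends up at 1 or more after the add. If that is the case, an equivalent but more explicit workaround (untested sketch) would be to ignore the sampled alpha entirely and pin it to full opacity:

float4 PixelShaderFunction(float2 TextureCoordinate : TEXCOORD0) : COLOR0
{
    float2 uv = TextureCoordinate;
    float4 pixelColor = tex2D(implicitInputSampler, uv);

    // contrast
    pixelColor.rgb = ((pixelColor.rgb - 0.5f) * max(Contrast, 0)) + 0.5f;

    // brightness
    pixelColor.rgb = pixelColor.rgb + (Brightness - 1);

    // Don't trust whatever alpha came out of the sampler; force full opacity.
    pixelColor.a = 1.0f;
    return pixelColor;
}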

This only happens in my dev environment at home, on an AMD 4850 GPU. When I try it on some NVIDIA cards or an AMD 5850, both versions work...

What is the reason for this? Did I miss some device initialization?

Cheers!


It seems that the alpha channel of the pixel shader's source texture is indeed 0...

The texture is declared as A8R8G8B8, and device.StretchRectangle is used to fill it. If the source surface doesn't carry a real alpha channel (e.g. it is X8R8G8B8), the alpha that StretchRectangle writes is presumably undefined, which would explain the driver-dependent results.

This method works differently depending on hardware :/
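
One possible workaround (untested sketch): instead of relying on the alpha that StretchRectangle leaves behind, fill the texture by rendering a fullscreen quad with a trivial pass-through pixel shader that writes an explicit alpha. The shader name and sampler binding here are illustrative only:

float4 CopyWithOpaqueAlpha(float2 uv : TEXCOORD0) : COLOR0
{
    // Copy RGB from the source, but write a well-defined alpha of 1.0
    // instead of whatever the surface copy left in the A channel.
    return float4(tex2D(implicitInputSampler, uv).rgb, 1.0f);
}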


I would be interested to know whether the texture really is 32-bit, mainly because I have run into AMD vs. NVIDIA issues before. In one case, a shader that was fully functional on one vendor was broken on the other (AMD worked, NVIDIA didn't) just because an offset in my vertex buffer declaration was off by one byte. But I digress; my point is that each driver might take liberties in how it interprets the alpha component if it is undeclared, or if the texture is not a true 32-bit format. Does the same thing occur with a different texture? Is your alpha channel set to opaque (1.0f)?

I'm interested because I have seen these old AMD/NVIDIA driver behaviour differences before: what works on one driver can actually be an error on the other.
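
A quick way to check that last point (illustrative snippet): temporarily visualize the sampled alpha as grayscale, so a black screen confirms the alpha really is 0:

float4 PixelShaderFunction(float2 TextureCoordinate : TEXCOORD0) : COLOR0
{
    float4 pixelColor = tex2D(implicitInputSampler, TextureCoordinate);

    // White = opaque (alpha 1.0), black = fully transparent (alpha 0.0).
    return float4(pixelColor.aaa, 1.0f);
}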
