
OpenGL Texture Quality Issues

A while ago I converted a C# program of mine to use OpenGL and found it ran perfectly (and faster) on my computer at home. However, I have two issues. Firstly, the code I use to free textures from the graphics card doesn't work; it gives me a memory access violation exception at runtime. Secondly, most of the graphics don't work on any machine but mine.

By accident, I managed to convert some of the graphics to 8-bit PNGs (all the others are 32-bit) and these work fine on other machines. Recognising this, I attempted to regulate the quality when loading the images. My attempts failed (this was a while ago; I think they largely involved trying to format a bitmap and then using GDI to draw the texture onto it, creating a lower-quality version). Is there any way in .NET to take a bitmap and nicely change the quality? A rough sketch of what I tried is shown below, followed by the texture-loading code concerned. I recall the loading code is largely based on some I found on Stack Overflow in the past, but which didn't quite suit my needs. 'img' is a .NET Image, and 'd' is an integer dimension, which I use to ensure the images are square.
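This is roughly what the failed GDI attempt looked like (a sketch from memory; Format16bppRgb565 is only a guess at a suitable lower-bit-depth format, and note that it discards the alpha channel):

// Sketch from memory of the quality-reduction attempt: draw the source image into a
// Bitmap created with a lower-bit-depth PixelFormat. Format16bppRgb565 is an assumption
// (it has no alpha); Graphics.FromImage cannot be used with indexed or 1555 formats.
Bitmap ReduceColourDepth(Image img, int d)
{
    Bitmap reduced = new Bitmap(d, d, System.Drawing.Imaging.PixelFormat.Format16bppRgb565);
    using (Graphics g = Graphics.FromImage(reduced))
    {
        g.DrawImage(img, new Rectangle(0, 0, d, d));
    }
    return reduced;
}

And the texture-loading code itself: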

uint[] output = new uint[1];

Bitmap bMap = new Bitmap(img, new Size(d, d));

System.Drawing.Imaging.BitmapData bMapData;
Rectangle rect = new Rectangle(0, 0, bMap.Width, bMap.Height);

bMapData = bMap.LockBits(rect, System.Drawing.Imaging.ImageLockMode.ReadOnly, bMap.PixelFormat);

gl.glGenTextures(1, output);
gl.glBindTexture(gl.GL_TEXTURE_2D, output[0]);
gl.glTexParameteri(gl.GL_TEXTURE_2D, gl.GL_TEXTURE_MAG_FILTER, gl.GL_NEAREST);

gl.glTexParameteri(gl.GL_TEXTURE_2D, gl.GL_TEXTURE_MIN_FILTER, gl.GL_NEAREST);
gl.glTexParameteri(gl.GL_TEXTURE_2D, gl.GL_TEXTURE_WRAP_S, gl.GL_CLAMP);
gl.glTexParameteri(gl.GL_TEXTURE_2D, gl.GL_TEXTURE_WRAP_T, gl.GL_CLAMP);
gl.glPixelStorei(gl.GL_UNPACK_ALIGNMENT, 1);
if (use16bitTextureLimit)
 gl.glTexImage2D(gl.GL_TEXTURE_2D, 0, gl.GL_RGBA_FLOAT16_ATI, bMap.Width, bMap.Height, 0, gl.GL_BGRA, gl.GL_UNSIGNED_BYTE, bMapData.Scan0);
else
 gl.glTexImage2D(gl.GL_TEXTURE_2D, 0, gl.GL_RGBA, bMap.Width, bMap.Height, 0, gl.GL_BGRA, gl.GL_UNSIGNED_BYTE, bMapData.Scan0);

bMap.UnlockBits(bMapData);
bMap.Dispose();

return output;

The 'use16bitTextureLimit' is a bool, and I rather hoped the code shown would reduce the quality to 16-bit, but I haven't noticed any difference. It may be that this works and the graphics cards still don't like it. I was unable to find any indication of a way to use 8-bit PNGs. This is in a function which returns the uint array (as a texture address) for use when rendering. The faulty texture disposal simply involves: gl.glDeleteTextures(1, imgsGL[i]); where imgsGL is an array of uint arrays.
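For completeness, the clean-up is just a loop over those handle arrays (a sketch; the binding's glDeleteTextures is taken to accept a count and a uint[] of texture names, mirroring the glGenTextures call above):

// Sketch of the clean-up; imgsGL[i] is the uint[1] returned by the loading function,
// so the count of 1 matches the array length.
for (int i = 0; i < imgsGL.Length; i++)
{
    gl.glDeleteTextures(1, imgsGL[i]);
}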

As I said, the rendering is fine on some computers, and the texture deletion causes a runtime error on all systems (except my netbook, where I can't create textures at all, though I think that may be linked to the quality issue).

If anyone can provide any relevant information, that would be great. I've spent many days on the program, and would really like it to be more compatible with less capable graphics cards.


The kind of access violation you encounter usually happens if the call to glTexImage2D causes a buffer overrun. Double-check that all the glPixelStore parameters related to unpacking are properly set and that the format parameter (the second of the two format arguments, i.e. GL_BGRA here) matches the type and size of the data you supply. I know this kind of bug very well, and those are the first checks I usually do whenever I encounter it.
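For example (a sketch, reusing the same 'gl' wrapper as in the question): locking the bitmap with an explicit Format32bppArgb instead of bMap.PixelFormat guarantees that the locked data really is 4 bytes per pixel in BGRA order, which is what the GL_BGRA / GL_UNSIGNED_BYTE arguments promise:

// Lock with an explicit 32-bit ARGB layout (BGRA byte order in memory on little-endian
// Windows), so the buffer matches GL_BGRA / GL_UNSIGNED_BYTE regardless of the source
// image's own PixelFormat (e.g. an 8-bit palettised PNG). GDI+ converts during LockBits.
System.Drawing.Imaging.BitmapData data = bMap.LockBits(
    new Rectangle(0, 0, bMap.Width, bMap.Height),
    System.Drawing.Imaging.ImageLockMode.ReadOnly,
    System.Drawing.Imaging.PixelFormat.Format32bppArgb);

gl.glPixelStorei(gl.GL_UNPACK_ALIGNMENT, 1);
gl.glTexImage2D(gl.GL_TEXTURE_2D, 0, gl.GL_RGBA, bMap.Width, bMap.Height, 0,
                gl.GL_BGRA, gl.GL_UNSIGNED_BYTE, data.Scan0);

bMap.UnlockBits(data);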

For the texture not showing up: did you check that the texture's dimensions are actually powers of two? In C, the test for a power of two can be written as a macro like this (it boils down to testing that exactly one bit of the integer is set):

#define ISPOW2(x) ( x && !( (x) & ((x) - 1) ) )

It is not necessary that a texture image is square, though. That is a common misconception; you really just have to make sure that each dimension is a power of two. A 16×128 image is perfectly fine.
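The same test in C#, plus (my addition, not something from the question) a helper that rounds a dimension up to the next power of two, in case you decide to rescale the bitmaps before uploading:

// C# equivalent of the macro above: true when exactly one bit is set.
static bool IsPow2(int x)
{
    return x > 0 && (x & (x - 1)) == 0;
}

// Hypothetical helper: round a dimension up to the next power of two.
static int NextPow2(int x)
{
    int p = 1;
    while (p < x) p <<= 1;
    return p;
}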

Changing the internal format to GL_RGBA_FLOAT16_ATI will probably even increase quality, but one cannot be sure, as GL_RGBA may be coerced to anything the driver sees fit. Also, this is a vendor-specific format, so I'd advise against its use. There are all kinds of ARB formats, including a half-float one (which is what FLOAT16_ATI is).
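If the goal is a defined on-card precision without the vendor-specific token, one option (a sketch; the constant names assume your binding exposes the standard GL tokens GL_RGBA8 and GL_RGB5_A1) is to request an explicit sized internal format instead of the unsized GL_RGBA:

// Request explicit sized internal formats so the driver cannot silently pick something else.
// GL_RGB5_A1 gives 16 bits per texel (5/5/5/1); GL_RGBA8 gives 32 bits (8/8/8/8).
int internalFormat = use16bitTextureLimit ? gl.GL_RGB5_A1 : gl.GL_RGBA8;

gl.glTexImage2D(gl.GL_TEXTURE_2D, 0, internalFormat, bMap.Width, bMap.Height, 0,
                gl.GL_BGRA, gl.GL_UNSIGNED_BYTE, bMapData.Scan0);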
