SlimDX/DirectX9/C# - How to access pixel-data in a Texture
This is my first question ever on Stack Overflow, hurray! I can honestly say I use Stack Overflow on a daily basis for both my work and personal programming mysteries. 99.9% of the time I actually find the answer I need here too, which is great!
My current problem actually stumped me a little as I can't seem to find anything which actually works. I've already read several posts on GameDev.net and found other resources around the net but can't sort it out.
I am in the process of porting a small 2D engine I wrote for XNA to SlimDX (just DirectX9 at the moment), which has been a good move as I learned more about the inner workings of DirectX in just a few days than I did in six months of working with XNA. I got most of my basic rendering features done and actually managed to recreate the XNA SpriteBatch with a ton of additional features (which I really missed in XNA).
One of the last things I'm trying to get working is extracting a source rectangle from a given texture and using it for tiling. Why: when not tiling, you can just adjust the UV coordinates to select the source region you want to display (e.g. 0.3;0.3 to 0.5;0.5), but when tiling, the UV coordinates are needed for the tiling itself (0;0 to 2;2 means tile the image twice), so you need a cut-out texture.
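To illustrate why tiling forces the cut-out: a minimal sketch (assuming a SlimDX `Device` named `device` and wrap texture addressing; names are illustrative, not from my engine) of what happens with UVs beyond 1.0:

```csharp
// Sketch: with wrap addressing, UVs from (0,0) to (2,2) repeat the
// ENTIRE texture twice in each direction -- there is no way to make
// the repetition cover only a sub-rectangle of the texture.
device.SetSamplerState(0, SamplerState.AddressU, TextureAddress.Wrap);
device.SetSamplerState(0, SamplerState.AddressV, TextureAddress.Wrap);

// A quad with UVs (0,0)-(2,2) now shows the whole texture tiled 2x2.
// To tile only a source rectangle, that rectangle first has to become
// its own texture, which is what the code below attempts.
```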
To make a long story short, I try to use the following:
DataRectangle dataRectangle = sprite.Texture.LockRectangle(0, LockFlags.None);
Format format = sprite.Texture.GetLevelDescription(0).Format;
byte[] buffer = new byte[4];
dataRectangle.Data.Read(buffer, ([y] * dataRectangle.Pitch) + ([x] * 4), buffer.Length);
texture.UnlockRectangle(0);
I tried different pixels but all seem to give bogus data. For example, I actually tried using my current avatar to see if the buffer I got from the DataRectangle matches the actual pixel in the image, but no luck (even checked if the Format was correct, which it is).
What am I doing wrong? Is there a better way to do it? Or is my UV story wrong and can it be solved much simpler than cutting out a source-rectangle before tiling it?
Thank you for your time,
Lennard Fonteijn
Update #1
I actually managed to export the pixel data to a Bitmap using the following conversion from a byte array:
int pixel = (buffer[0] & 0xFF) | ((buffer[1] & 0xFF) << 8) | ((buffer[2] & 0xFF) << 16) | ((255 - buffer[3] & 0xFF) << 24);
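For anyone trying to reproduce this inspection step, here is a hedged sketch of dumping a locked level into a `System.Drawing.Bitmap`. It assumes the format is A8R8G8B8 (so the locked bytes are in B, G, R, A order) and that `texture` is as in the snippets above:

```csharp
// Illustrative: copy a locked A8R8G8B8 texture level into a Bitmap
// so individual pixels can be checked against the source image.
DataRectangle rect = texture.LockRectangle(0, LockFlags.ReadOnly);
SurfaceDescription desc = texture.GetLevelDescription(0);
Bitmap bitmap = new Bitmap(desc.Width, desc.Height);
byte[] buffer = new byte[4];
for (int y = 0; y < desc.Height; y++)
{
    for (int x = 0; x < desc.Width; x++)
    {
        // Pitch is the byte length of one row and may be larger than
        // Width * 4, so rows are stepped by Pitch, not by Width.
        rect.Data.Seek((y * rect.Pitch) + (x * 4), SeekOrigin.Begin);
        rect.Data.Read(buffer, 0, 4);
        // A8R8G8B8 is laid out in memory as B, G, R, A.
        bitmap.SetPixel(x, y,
            Color.FromArgb(buffer[3], buffer[2], buffer[1], buffer[0]));
    }
}
texture.UnlockRectangle(0);
```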
So the data doesn't seem so bogus as I thought it was. My next problem, however, is grabbing the pixels specified in the source rectangle and copy them to a new texture. The image I'm trying to cut is 150x150, but for some reason it is stretched to a 256x256 image (power of two), but when actually trying to access pixels beyond 150x150, it throws an OutOfBounds exception. Also, when I actually try to create a second blank texture of size 256x256, no matter what I put into it, it turns out completely black.
Here's my current code:
//Texture texture = 150x150
DataRectangle dataRectangle = texture.LockRectangle(0, LockFlags.None);
SurfaceDescription surface = texture.GetLevelDescription(0);
Texture texture2 = new Texture(_graphicsDevice, surface.Width, surface.Height, 0, surface.Usage, surface.Format, surface.Pool);
DataRectangle dataRectangle2 = texture2.LockRectangle(0, LockFlags.None);
for (int k = sourceY; k < sourceHeight; k++)
{
    for (int l = sourceX; l < sourceWidth; l++)
    {
        byte[] buffer = new byte[4];
        dataRectangle.Data.Seek((k * dataRectangle.Pitch) + (l * 4), SeekOrigin.Begin);
        dataRectangle.Data.Read(buffer, 0, 4);
        dataRectangle2.Data.Seek(((k - sourceY) * dataRectangle2.Pitch) + ((l - sourceX) * 4), SeekOrigin.Begin);
        dataRectangle2.Data.Write(buffer, 0, 4);
    }
}
texture.UnlockRectangle(0);
texture2.UnlockRectangle(0);
_graphicsDevice.SetTexture(0, texture2);
So my new (additional) questions are: how can I copy pixels from one texture to another, smaller texture, including the alpha channel? And why does the SurfaceDescription report 256x256 when my original texture is 150x150?
It's kind of awkward to answer my own question, but after some more digging and simple trial and error, I found the solution.
First, I had to change the way I loaded my texture. To prevent it from being internally resized to a power-of-two size, I had to use the following method:
Texture texture = Texture.FromFile(_graphicsDevice, [filePath], D3DX.DefaultNonPowerOf2, D3DX.DefaultNonPowerOf2, 1, Usage.None, Format.Unknown, Pool.Managed, Filter.None, Filter.None, 0);
Note that the size is explicitly specified as non-power-of-two.
Next, there was a mistake in my new texture definition. Instead of specifying 0 levels (which auto-generates a full mipmap chain), I had to specify 1 level, like this:
Texture texture2 = new Texture(_graphicsDevice, [sourceWidth], [sourceHeight], 1, surface.Usage, surface.Format, surface.Pool);
After having done that, the for-loop from my actual question works fine:
DataRectangle dataRectangle = texture.LockRectangle(0, LockFlags.None);
SurfaceDescription surface = texture.GetLevelDescription(0);
DataRectangle dataRectangle2 = texture2.LockRectangle(0, LockFlags.None);
for (int y = [sourceY]; y < [sourceHeight]; y++)
{
    for (int x = [sourceX]; x < [sourceWidth]; x++)
    {
        byte[] buffer = new byte[4];
        dataRectangle.Data.Seek((y * dataRectangle.Pitch) + (x * 4), SeekOrigin.Begin);
        dataRectangle.Data.Read(buffer, 0, 4);
        dataRectangle2.Data.Seek(((y - [sourceY]) * dataRectangle2.Pitch) + ((x - [sourceX]) * 4), SeekOrigin.Begin);
        dataRectangle2.Data.Write(buffer, 0, 4);
    }
}
texture.UnlockRectangle(0);
texture2.UnlockRectangle(0);
_graphicsDevice.SetTexture(0, texture2);
Everything in brackets is a variable from outside this snippet; I assume _graphicsDevice is clear enough. I am aware the .Seek calls can be simplified, but I think they work fine for example purposes. Note that I do not recommend doing this kind of operation every frame, as it can drain your frame rate quickly when used carelessly.
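For reuse, the whole answer could be wrapped into a single helper along these lines. This is a hedged sketch, not the exact code from my engine: `CopySubTexture`, its parameter names, and the assumption of a 4-bytes-per-pixel format such as A8R8G8B8 are all illustrative.

```csharp
// Illustrative helper: copy a sub-rectangle of an A8R8G8B8 texture
// into a new texture of exactly that size. Assumes 4 bytes per pixel.
public static Texture CopySubTexture(Device device, Texture source,
                                     int sourceX, int sourceY,
                                     int width, int height)
{
    SurfaceDescription desc = source.GetLevelDescription(0);
    // 1 level: no mipmap chain, so level 0 is the whole texture.
    Texture result = new Texture(device, width, height, 1,
                                 desc.Usage, desc.Format, desc.Pool);

    DataRectangle src = source.LockRectangle(0, LockFlags.ReadOnly);
    DataRectangle dst = result.LockRectangle(0, LockFlags.None);
    byte[] row = new byte[width * 4];
    for (int y = 0; y < height; y++)
    {
        // Copy one row at a time; Pitch may exceed width * 4.
        src.Data.Seek(((sourceY + y) * src.Pitch) + (sourceX * 4), SeekOrigin.Begin);
        src.Data.Read(row, 0, row.Length);
        dst.Data.Seek(y * dst.Pitch, SeekOrigin.Begin);
        dst.Data.Write(row, 0, row.Length);
    }
    source.UnlockRectangle(0);
    result.UnlockRectangle(0);
    return result;
}
```

Copying row by row instead of pixel by pixel also avoids most of the per-pixel Seek overhead of the loop above.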
It took me quite some time to figure it out, but the result is satisfying. I would like to thank everyone who glanced over the question and tried to help me.
Lennard Fonteijn