
BitConverter.ToInt16 Adds 0xFFFF to Number? (C#)

I've got a problem here that's probably something that I'm just overlooking, but I can't understand why it's happening...

The problem I'm having is that I'm using BitConverter to get an Int16 from a 2-byte array, but whenever I do the conversion I get the number I should get, with 0xFFFF prepended to it.

Example...

byte[] ourArray = { 0x88, 0xA3, 0x67, 0x3D };
Int16 CreationDate = BitConverter.ToInt16(new byte[] {ourArray[2], ourArray[3]} , 0);
Int16 CreationTime = BitConverter.ToInt16(new byte[] {ourArray[0], ourArray[1]}, 0);

That returns CreationDate as 0x3d67 (correct), but CreationTime as 0xffffa388.

Would anyone happen to know why this is happening, and a way to correct this?


0xA388 is a negative Int16, so converting it to Int32 gives a sign-extended negative int with the same value. The 0xFFFF you see is the sign extension (the upper 16 bits padded with '1' bits). Better to use UInt16 and UInt32.
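
A minimal sketch of the unsigned approach, assuming the same byte layout as in the question (note that BitConverter can read at an offset directly, so the temporary two-byte arrays aren't needed):

using System;

class Program
{
    static void Main()
    {
        byte[] ourArray = { 0x88, 0xA3, 0x67, 0x3D };

        // ToUInt16 reads two bytes starting at the given offset, so the
        // intermediate two-byte arrays from the question are unnecessary.
        UInt16 creationTime = BitConverter.ToUInt16(ourArray, 0); // 0xA388
        UInt16 creationDate = BitConverter.ToUInt16(ourArray, 2); // 0x3D67

        // Widening a UInt16 zero-extends, so no 0xFFFF prefix appears.
        Console.WriteLine("0x{0:X4}", creationTime); // 0xA388
        Console.WriteLine("0x{0:X4}", creationDate); // 0x3D67
    }
}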


0xffffa388 is not an Int16. Are you sure you're not casting it to some 32-bit type?
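
If the value is being widened to a 32-bit type before it's formatted, that would explain the extra digits. A minimal sketch of the difference (a reconstruction of the symptom, not the asker's actual print statement):

using System;

class SignExtensionDemo
{
    static void Main()
    {
        short creationTime = unchecked((short)0xA388); // -23672

        // Formatted as a 16-bit value: prints A388, no 0xFFFF prefix.
        Console.WriteLine(creationTime.ToString("X"));

        // Implicitly widened to a 32-bit int first: the sign bit is
        // copied into the upper 16 bits, so this prints FFFFA388.
        int widened = creationTime;
        Console.WriteLine(widened.ToString("X"));
    }
}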
