Why use hex for array indexes

I was recently using .net reflector to look at some DLL files, and I noticed one was using hex instead of decimal for its array indexes.

public Random(int Seed)
{
    this.SeedArray = new int[0x38];
    int num2 = 0x9a4ec86 - Math.Abs(Seed);
    this.SeedArray[0x37] = num2;
    int num3 = 1;
    for (int i = 1; i < 0x37; i++)
    {
        int index = (0x15 * i) % 0x37;
        this.SeedArray[index] = num3;
        num3 = num2 - num3;
        if (num3 < 0)
        {
            num3 += 0x7fffffff;
        }
        num2 = this.SeedArray[index];
    }
    for (int j = 1; j < 5; j++)
    {
        for (int k = 1; k < 0x38; k++)
        {
            this.SeedArray[k] -= this.SeedArray[1 + ((k + 30) % 0x37)];
            if (this.SeedArray[k] < 0)
            {
                this.SeedArray[k] += 0x7fffffff;
            }
        }
    }
    this.inext = 0;
    this.inextp = 0x15;
    Seed = 1;
}

Why did the author use hex instead of decimal?


Whether an integer literal was originally written in source code as hex or decimal (or, say, ternary, assuming the programming language supported it) doesn't get recorded in the IL — only the numeric value is compiled. What you're seeing is a formatting choice made by the decompiler.

In fact, you might be able to see all of those literals in decimal by setting:

View -> Options -> Disassembler -> Number Format : Decimal
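You can verify that the base of a literal is purely a source-level notation: a hex literal and its decimal equivalent are the same constant to the compiler, so nothing in the compiled output can distinguish them. A minimal sketch (the class and variable names here are just for illustration):

```csharp
using System;

class HexVsDecimal
{
    static void Main()
    {
        // 0x37 and 55 are the same int constant; the compiler stores only
        // the value, not the base it was written in.
        int fromHex = 0x37;
        int fromDecimal = 55;
        Console.WriteLine(fromHex == fromDecimal); // prints True

        // The same holds for the other constants in the decompiled Random:
        Console.WriteLine(0x9a4ec86 == 161803398);     // prints True
        Console.WriteLine(0x7fffffff == int.MaxValue); // prints True
    }
}
```

Since both spellings compile to identical IL, a decompiler can only guess which one the original author typed; Reflector's Number Format option just controls that guess.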
