
C# Bitwise Operations on shorts - Why cast to an int?

short BitwiseTest(short value)
{
    short test1 = ((value >> 8) & 0xFF);
    short test2 = unchecked((short)((value << 8) & 0xFF00));
    return (test1 | test2);
}

The above code is supposed to be an (inefficient) example that swaps the endianness (byte order) of a short (a signed 16-bit integer) in C#.

However, the above code will not compile, because C# implicitly casts the shorts to ints on both of the following lines:

First case:

short test1 = ((value >> 8) & 0xFF);

Second case:

return (test1 | test2);
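
The same error can be reproduced without any shifting at all (the variable names here are just for illustration):

short a = 1;
short b = 2;
short c = a & b;          // error CS0266: Cannot implicitly convert type 'int' to 'short'
short d = (short)(a & b); // fine: the int result is explicitly cast back down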

Why is this cast taking place? Would I achieve the expected result simply by casting back to a short? Like so:

short BitwiseTest2(short value)
{
    short test1 = (short)((value >> 8) & 0xFF);
    short test2 = unchecked((short)((value << 8) & 0xFF00));
    return ((short)(test1 | test2));
}

If not, why not?

Note that I do understand why C# promotes a short to an int when performing a left bit-shift, hence the explicit cast in the assignment of test2.


This is basically answered by Eric Lippert himself in an answer to another (quite different) question.

Take a look at Integer summing blues, short += short problem
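
In short: the shift and bitwise operators are simply not defined for operands smaller than int, so both short operands undergo binary numeric promotion to int, and the int result then needs an explicit narrowing cast. So yes, your BitwiseTest2 returns the expected byte-swapped value. Here is a quick sanity check (the Main harness and sample values are mine, and I added unchecked to the final cast so it also compiles cleanly under /checked):

using System;

class Program
{
    static short BitwiseTest2(short value)
    {
        short test1 = (short)((value >> 8) & 0xFF);
        short test2 = unchecked((short)((value << 8) & 0xFF00));
        // unchecked: when test2 is negative it sign-extends to an int with the
        // high bits set, and the narrowing cast would throw under checked arithmetic
        return unchecked((short)(test1 | test2));
    }

    static void Main()
    {
        Console.WriteLine("{0:X4}", BitwiseTest2(0x1234));                    // prints 3412
        Console.WriteLine("{0:X4}", BitwiseTest2(unchecked((short)0xFF01))); // prints 01FF
    }
}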
