
A question regarding binary conversion to hex for signed bytes

1) I understand that when converting binary to decimal, the leftmost bit represents position 0, the next one 1, and so on. So, for example, to convert 0001 to decimal it is 0*2^0 + 0*2^1 + 0*2^2 + 1*2^3, so the decimal value would be 8.

2) When, for example, you have the signed hex value 0x80, it converts to binary 1000 0000. However, to compute the decimal value of this binary representation, since it is signed we have to invert the 7 bits to get 1111111, then add 1, which gives us 10000000, which is -128.

My question is: why, in the second case, when computing the decimal value for the signed byte, did we have to start counting from the rightmost bit as 0, so that we have ... + 1*2^8? Why isn't 2^0 the leftmost bit, as we computed in 1)?

Thanks.


No, usually binary is read the other way around: 0001 is 1, 1000 is 8.


In answer to point 1: not quite. 0001 is actually 1, while 1000 is 8. You appear to be coming from the wrong end. The binary number 1101, for example, would be:

+------ 1 * 2^3     =  8
|+----- 1 * 2^2     =  4
||+---- 0 * 2^1     =  0
|||+--- 1 * 2^0     =  1
||||                  --
1101                = 13
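Just to make that place-value sum concrete, here is a minimal C sketch of the same calculation (the variable names are mine, nothing standard):

    #include <stdio.h>

    int main(void) {
        unsigned char bits = 0xD;   /* binary 1101 */
        int value = 0;

        /* bit i, counting from the right starting at 0, contributes bit_i * 2^i */
        for (int i = 0; i < 4; i++) {
            int bit_i = (bits >> i) & 1;
            value += bit_i * (1 << i);
        }

        printf("%d\n", value);      /* prints 13 */
        return 0;
    }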

For point 2, the easiest way to turn a bit pattern into a signed number is to first turn it into an unsigned value (0x80 = 128), then, if the top bit is set, subtract the bias (256 for eight bits, 65536 for 16 bits, and so on) to get -128.

The bias should only affect the calculation at the end of the process; it's a way to map the range 0..255 to -128..127, or 0..65535 to -32768..32767.
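A small C sketch of that bias idea, assuming a typical two's-complement machine (the int8_t cast at the end is just letting the compiler do the same mapping for us):

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        unsigned int raw = 0x80;    /* as an unsigned byte this is 128 */

        /* subtract the bias (256 for eight bits) whenever the sign bit is set */
        int value = (raw & 0x80) ? (int)raw - 256 : (int)raw;
        printf("%d\n", value);      /* prints -128 */

        /* equivalent shortcut: reinterpret the byte as a signed 8-bit value */
        int8_t byte = (int8_t)raw;  /* implementation-defined in older C standards, but -128 in practice */
        printf("%d\n", byte);       /* also prints -128 */

        return 0;
    }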
