
When typecasting to a char in C, which bytes are used to make the character?

When you typecast from an int to a char, you are cutting the number of bytes used down from 4 to 1. How does it pick which byte it will use to make the char?

Does it take the most significant byte?

Or does it take the least significant?

Or is there some sort of rule I should know about?


C will take the least-significant byte when doing a narrowing conversion, so if you have the integer value 0xCAFEBABE and you convert it to a char, you'll get the value 0xBE.
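You can see this with a short test program; a minimal sketch, assuming the common case of a 32-bit int and an 8-bit char (using unsigned char to sidestep signedness issues):

    #include <stdio.h>

    int main(void) {
        unsigned int value = 0xCAFEBABEu;          /* byte pattern CA FE BA BE */
        unsigned char low = (unsigned char)value;  /* narrowing keeps the low-order byte */

        printf("original: 0x%X\n", value);
        printf("as char:  0x%X\n", (unsigned)low); /* prints 0xBE */
        return 0;
    }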

Of course, there's no actual guarantee that an int is four bytes, or that a byte is eight bits (a char is always exactly one byte by definition), but the logic of the truncation is the same either way: the higher-order bits that don't fit into the char are dropped.


If char is signed, the result is implementation-defined unless the original value already fits in the range of values for char: the implementation must document what it produces, and it is also allowed to raise an implementation-defined signal instead. If char is unsigned (which the standard allows for plain char), then the value is reduced modulo 1<<CHAR_BIT (usually 256, so you get the low-order byte).
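To illustrate the unsigned case, here's a minimal sketch (assuming the usual CHAR_BIT of 8, so the reduction is modulo 256):

    #include <limits.h>
    #include <stdio.h>

    int main(void) {
        unsigned int big = 1000;                   /* doesn't fit in 8 bits */
        unsigned char uc = (unsigned char)big;     /* reduced modulo UCHAR_MAX + 1 */

        printf("CHAR_BIT = %d\n", CHAR_BIT);
        printf("1000 as unsigned char: %u\n", (unsigned)uc); /* prints 232, i.e. 1000 % 256 */
        return 0;
    }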

