C: Why does an integer act as a long integer?
From what I know, an integer is in the range of -32,768 to 32,767. A long integer is in the range of -2,147,483,648 to 2,147,483,647. I double-checked Wikipedia to make sure.
The problem now:
int a = 2147483647;
printf("a: %d\n", a);
Why does this work? If I add 1 to 2147483647 it prints garbage, which is what I would expect if the variable were a long integer. But why am I allowed to assign a long-integer value to an int in the first place?
From what I know, an integer is in the range of -32,768 to 32,767.
This is incorrect. The range of int is at least -32,767 to 32,767; it may have a greater range, and on most 32-bit and 64-bit platforms it does. You can find out the range of the int type on your platform by checking the INT_MAX and INT_MIN macros defined in <limits.h>.
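For example, this small program (a minimal sketch; the values printed depend entirely on your platform) reports the actual range. On a typical platform with a 32-bit int it would print -2147483648 and 2147483647:

#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* INT_MIN and INT_MAX describe the range of int on this platform */
    printf("INT_MIN = %d\n", INT_MIN);
    printf("INT_MAX = %d\n", INT_MAX);
    return 0;
}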
What you know is wrong. Unlike some other languages such as Java, the exact sizes of int and long in C are implementation-defined (within set limits). Clearly, you are using a platform where int is 32 bits wide, but on other platforms it may be 16 bits, and on some platforms int is 64 bits wide. Similarly, on some platforms long is 32 bits wide; on others it is 64 bits wide. It could be wider still if an implementation chose to make it so.
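You can check what your implementation chose with sizeof (a minimal sketch; the sizes printed are entirely platform-dependent):

#include <stdio.h>

int main(void)
{
    /* sizeof reports widths in bytes; %zu matches its size_t result */
    printf("sizeof(int)  = %zu\n", sizeof(int));
    printf("sizeof(long) = %zu\n", sizeof(long));
    return 0;
}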
The real width of an int is generally implementation-dependent. You can use the INT_MAX/INT_MIN constants in the limits.h header to find out what the real range is. On an x86 machine, an int is typically 4 bytes, so 2^31 - 1 is indeed the maximum value it can hold.
sizeof(short) <= sizeof(int) <= sizeof(long)
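If your code depends on that ordering, you can state it as a compile-time check (a sketch assuming a C11 compiler, where <assert.h> provides static_assert):

#include <assert.h>

/* These assertions mirror the ordering above; on a conforming
   implementation they never fire, but they document the intent. */
static_assert(sizeof(short) <= sizeof(int), "short must not be wider than int");
static_assert(sizeof(int) <= sizeof(long), "int must not be wider than long");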