
Why do C compilers specify long to be 32-bit and long long to be 64-bit?

Wouldn't it have made more sense to make long 64-bit and reserve long long until 128-bit numbers become a reality?


Yes, it does make sense, but Microsoft had their own reasons for defining "long" as 32-bits.

As far as I know, of all the mainstream systems right now, Windows is the only OS where "long" is 32 bits even on 64-bit hardware (the LLP64 model). On 64-bit Unix and Linux (LP64), it's 64 bits.

All compilers targeting Windows define "long" as 32 bits in order to stay compatible with Microsoft's ABI.

For this reason, I avoid using "int" and "long". Occasionally I'll use "int" for error codes and booleans (in C), but I never use them for any code that is dependent on the size of the type.
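A minimal sketch of that practice (the names below are illustrative, not from any particular code base): use the <stdint.h> fixed-width types wherever the code depends on the size of the type, such as an on-disk layout, and keep plain "int" for size-independent values such as error codes.

  /* Fixed-width types from <stdint.h> keep the layout identical
     whether "long" is 32 or 64 bits on the target platform. */
  #include <stdint.h>

  struct record_header {
      uint32_t magic;    /* exactly 32 bits on every platform */
      uint64_t length;   /* exactly 64 bits on every platform */
  };

  /* Plain int is fine here: an error code does not depend on width. */
  int open_record(const char *path);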


The C standard does NOT specify the exact bit-length of the primitive data types, only their minimum bit-lengths. So compilers have latitude in choosing the bit-length of each primitive data type. In making that choice, the compiler designer has to weigh several factors, including the computer architecture.

Here is a reference: http://en.wikipedia.org/wiki/C_syntax#Primitive_data_types
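A minimal sketch that makes this concrete: the program below is portable C99, but its output differs by platform (e.g. "long" prints as 8 bytes on 64-bit Linux and 4 bytes on 64-bit Windows), precisely because the standard only fixes lower bounds.

  #include <stdio.h>

  int main(void) {
      /* sizeof reports bytes; only the minimums are guaranteed. */
      printf("short:     %zu bytes\n", sizeof(short));
      printf("int:       %zu bytes\n", sizeof(int));
      printf("long:      %zu bytes\n", sizeof(long));
      printf("long long: %zu bytes\n", sizeof(long long));
      return 0;
  }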


For historical reasons. For a long time (pun intended), "int" meant 16-bit; hence "long" as 32-bit. Of course, times changed. Hence "long long" :)

PS:

GCC (and others) currently support 128-bit integers as __int128 and unsigned __int128 (a compiler extension; there is no standard (u)int128_t).
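A minimal sketch of that extension (available in GCC and Clang on 64-bit targets; printf has no specifier for it, so the value is printed as two 64-bit halves):

  #include <stdio.h>
  #include <stdint.h>

  int main(void) {
      /* 2^64: one past the largest 64-bit unsigned value. */
      unsigned __int128 x = (unsigned __int128)UINT64_MAX + 1;
      uint64_t hi = (uint64_t)(x >> 64);  /* upper 64 bits */
      uint64_t lo = (uint64_t)x;          /* lower 64 bits */
      printf("2^64 = 0x%016llx%016llx\n",
             (unsigned long long)hi, (unsigned long long)lo);
      return 0;
  }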

PPS:

Here's a discussion of why the folks at GCC made the decisions they did:

http://www.x86-64.org/pipermail/discuss/2005-August/006412.html


For the history, including why UNIX systems generally converged on LP64, why Windows did not (a big code base that assumed 16-bit "int" and 32-bit "long"), and the various arguments, see "The Long Road to 64 Bits" (subtitled "Double, double, toil and trouble", after Shakespeare's Macbeth): https://queue.acm.org/detail.cfm?id=1165766 (ACM Queue, 2006) or https://dl.acm.org/doi/pdf/10.1145/1435417.1435431 (CACM, 2009).

Note: I helped design the 64/32-bit MIPS R4000, suggested the idea that led to <inttypes.h>, and wrote the long long motivation section for C99.


Ever since the days of the first C compiler for a general-purpose reprogrammable microcomputer, it has often been necessary for code to make use of types that held exactly 8, 16, or 32 bits, but until 1999 the Standard didn't explicitly provide any way for programs to specify that. On the other hand, nearly all compilers for 8-bit, 16-bit, and 32-bit microcomputers define "char" as 8 bits, "short" as 16 bits, and "long" as 32 bits. The only difference among them is whether "int" is 16 bits or 32.

While a 32-bit or larger CPU could use "int" as a 32-bit type, leaving "long" available as a 64-bit type, there is a substantial corpus of code which expects that "long" will be 32 bits. While the C Standard added "fixed-sized" types in 1999, there are other places in the Standard which still use "int" and "long", such as "printf". While C99 added macros to supply the proper format specifiers for fixed-sized integer types, there is a substantial corpus of code which expects that "%ld" is a valid format specifier for int32_t, since it will work on just about any 8-bit, 16-bit, or 32-bit platform.
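Those C99 macros live in <inttypes.h>; a minimal sketch of their use, which stays correct whether int32_t is "int" or "long" on the platform:

  #include <stdio.h>
  #include <inttypes.h>

  int main(void) {
      int32_t a = 123456789;
      int64_t b = 123456789012345678;
      /* PRId32/PRId64 expand to the right conversion specifier per
         platform, so there is no guessing between "%d", "%ld", "%lld". */
      printf("a = %" PRId32 "\n", a);
      printf("b = %" PRId64 "\n", b);
      return 0;
  }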

Whether it makes more sense to have "long" be 32 bits, out of respect for an existing code base going back decades, or 64 bits, so as to avoid the need for the more verbose "long long" or "int64_t" to identify the 64-bit types, is probably a judgment call. Given that new code should probably favor the use of specified-size types when practical, I'm not sure I see a compelling advantage to making "long" 64 bits unless "int" is also 64 bits (which would create even bigger problems with existing code).



C99 N1256 standard draft

The sizes of long and long long are implementation-defined; all we know are:

  • minimum size guarantees
  • relative sizes between the types

5.2.4.2.1 Sizes of integer types <limits.h> gives the minimum sizes:

1 [...] Their implementation-defined values shall be equal or greater in magnitude (absolute value) to those shown [...]

  • UCHAR_MAX 255 // 2^8 − 1
  • USHRT_MAX 65535 // 2^16 − 1
  • UINT_MAX 65535 // 2^16 − 1
  • ULONG_MAX 4294967295 // 2^32 − 1
  • ULLONG_MAX 18446744073709551615 // 2^64 − 1
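As a quick check, a minimal sketch that prints the actual limits of the current implementation next to these required minimums:

  #include <stdio.h>
  #include <limits.h>

  int main(void) {
      /* Actual values meet or exceed the standard's minimums. */
      printf("UINT_MAX   = %u   (min 65535)\n", UINT_MAX);
      printf("ULONG_MAX  = %lu  (min 4294967295)\n", ULONG_MAX);
      printf("ULLONG_MAX = %llu (min 18446744073709551615)\n", ULLONG_MAX);
      return 0;
  }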

6.2.5 Types then says:

8 For any two integer types with the same signedness and different integer conversion rank (see 6.3.1.1), the range of values of the type with smaller integer conversion rank is a subrange of the values of the other type.

and 6.3.1.1 Boolean, characters, and integers determines the relative conversion ranks:

1 Every integer type has an integer conversion rank defined as follows:

  • The rank of long long int shall be greater than the rank of long int, which shall be greater than the rank of int, which shall be greater than the rank of short int, which shall be greater than the rank of signed char.
  • The rank of any unsigned integer type shall equal the rank of the corresponding signed integer type, if any.
  • For all integer types T1, T2, and T3, if T1 has greater rank than T2 and T2 has greater rank than T3, then T1 has greater rank than T3.
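Put together, these rules mean the ranges nest by rank, so the MAX values cannot decrease as rank increases. A minimal sketch encoding that guarantee as preprocessor checks (which cannot fail on a conforming implementation; 5.2.4.2.1 requires the <limits.h> values to be usable in #if):

  #include <limits.h>

  /* Ranges nest by rank, so the maximums are non-decreasing. */
  #if SCHAR_MAX > SHRT_MAX || SHRT_MAX > INT_MAX || \
      INT_MAX > LONG_MAX || LONG_MAX > LLONG_MAX
  #error "non-conforming implementation: ranges not nested by rank"
  #endif

  int main(void) { return 0; }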
