
declaring the largest array using size_t

i wanted to declare a very large array. i found that the max size of an array is size_t, which is defined as UINT_MAX

so i wrote the code like this

int arr[UINT_MAX];

when i compile this, it says overflow in array dimension

but when i write like this

size_t s = UINT_MAX;
int arr[s]; 

it compiles properly. what's the difference


First error: size_t is not necessarily unsigned int, so its maximum value can differ from that of unsigned int (UINT_MAX). Moreover, in C++, to get information about the limits of a type you should use std::numeric_limits:

#include <cstddef>
#include <limits>

std::size_t s = std::numeric_limits<std::size_t>::max();

Second error: you won't ever get an array that big. Since size_t is required to be able to express the size of the biggest possible object, it is typically big enough to cover the whole address space available to the application; but actually allocating such an object would mean dedicating the entire address space to it, which is infeasible. Worse, you're requesting an array of ints of that length, which means it would be UINT_MAX * sizeof(int) bytes big, probably about 4 times the whole address space: clearly nonsense. And by the way, sizeof(arr) wouldn't be able to express the size of such an object, and in general pointers couldn't even reach the top of that array. The compiler detects these faults and stops you.

Moreover, I infer that you're trying to allocate that thing on the stack, which is usually much, much smaller than the total memory available to the application; in general it's not a good idea to allocate big arrays there (you should use the heap for that).

Third error: allocating all that memory doesn't make sense. If you have big memory requirements, you should allocate on the heap, not on the stack, and allocate just the memory you need, to play along well with the OS and the other applications (this last consideration does not apply if you're working on embedded systems where yours is the only application running).

The second snippet shouldn't even be valid C++: if that array is allocated on the stack, you're going nonstandard, because it is a variable-length array (VLA), available in C99 but rejected by both the current and the upcoming C++ standard. In that case the array's allocation happens at runtime (VLAs in general do not have fixed dimensions), so the check is not obvious for the compiler to perform (although I suppose the optimizer could spot it easily: if the VLA's semantics were no different from a regular array's, it could replace the VLA with a regular array, which would then fail for the same reasons I stated).

Long story short: it makes no sense to allocate all that memory (that you couldn't even address), especially on the stack. Use the heap and allocate just what you need. If you have special requirements, you should investigate the special virtual memory functions provided by your OS.


You are delaying the error.

You are asking for about 16GB* of contiguous memory in both cases, which is impossible on a 32 bit machine.

Your first attempt is hard-coding the size, and your compiler was nice enough to tell you in advance that it will not succeed.

Your second attempt is using a variable for the size, which has bypassed the compiler warning, but it will still fail when you attempt to run the program.

*On typical architectures


size_t s = UINT_MAX;
int arr[s];  

won't compile unless you declare s as const. Also note that UINT_MAX is only the theoretical upper bound on the array size. In practice you won't be able to declare an array more than a few million elements long, because static, automatic, and indeed all memory is limited.


What compiler are you using? With VC++, I get an error in both cases (after making s const). Even if it did compile, it would result in undefined behaviour, because UINT_MAX * sizeof(int) certainly won't fit in your process's address space; furthermore, the size computation itself would overflow and yield the wrong value.


size_t s = UINT_MAX;
int arr[s];

means arr is a variable-length array (VLA). I think that is not allowed by the C++ standard. I would expect a warning if compiled with

g++ -ansi -pedantic -std=c++98

Also, think about it: arr needs UINT_MAX * sizeof(int) bytes. That's quite big!


For each unsigned integral type, the maximum value can be obtained by assigning -1. For example, on a 64-bit system, the following code

#include <stdio.h>
#include <stddef.h>

int main(void) {
    unsigned char uchar_max = -1;
    printf("%u\n", uchar_max);
    unsigned int uint_max = -1;
    printf("%u\n", uint_max);
    unsigned long ulong_max = -1;
    printf("%lu\n", ulong_max);
    size_t sizet_max = -1;
    printf("%zu\n", sizet_max);  /* %zu is the portable format for size_t */
    return 0;
}

prints:

255
4294967295
18446744073709551615
18446744073709551615
