
Sizeof Pointer to Array

If I have an array declared like this:

int a[3][2];

then why is:

sizeof(a+0) == 8

whereas:

sizeof(a)   == 24

I don't understand how adding 0 to a changes the result of sizeof. Is there some implicit type conversion going on?


If you add 0 to a, then a is first converted to a pointer value of type int (*)[2] (pointing to the first element, i.e. the first row, of the array of type int[3][2]). Then 0 is added to that pointer, which adds 0 * sizeof(int[2]) bytes to the address it represents. Since that multiplication yields 0, the pointer value is unchanged. But the result is still a pointer, so sizeof(a+0) yields the size of a pointer, which is 8 bytes on your box.
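To make the decay visible, here is a minimal sketch (assuming, as in the question, a 64-bit platform with 8-byte pointers):

#include <stdio.h>

int main(void)
{
    int a[3][2];

    /* a + 0 forces array-to-pointer conversion: the result has
       type int (*)[2] and points at the first row of a */
    int (*p)[2] = a + 0;

    printf("a           = %p\n", (void *)a);      /* same address...  */
    printf("p           = %p\n", (void *)p);      /* ...as this one   */
    printf("sizeof(a+0) = %zu\n", sizeof(a + 0)); /* pointer size, e.g. 8 */
    return 0;
}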

If you write sizeof(a), there is no reason for the compiler to convert a to a pointer value (that conversion only makes sense when you index elements or do pointer arithmetic with their addresses); in fact, sizeof is one of the few contexts in which an array operand does not decay to a pointer. So the expression a keeps its array type, and you get the size of int[3][2] instead of the size of int (*)[2]: 3 * 2 * sizeof(int), which on your box is 24 bytes.
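And a sketch of how the 24 breaks down (assuming 4-byte int, which matches the numbers in the question):

#include <stdio.h>

int main(void)
{
    int a[3][2];

    printf("sizeof(a)       = %zu\n", sizeof(a));        /* whole array: 3 * 2 * sizeof(int) = 24 */
    printf("sizeof(a[0])    = %zu\n", sizeof(a[0]));     /* one row, int[2]: 2 * sizeof(int) = 8 */
    printf("sizeof(a[0][0]) = %zu\n", sizeof(a[0][0]));  /* one element: sizeof(int) = 4 */
    return 0;
}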

Hope this clarifies things.


sizeof tells you the size of the type of its operand. When you add 0 to a, the array decays to a pointer of type int (*)[2], and a pointer is 8 bytes on typical 64-bit systems.
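If you want the compiler itself to confirm the decayed type, a C11 _Generic probe can do it; TYPE_NAME below is just an illustrative macro, not a standard facility:

#include <stdio.h>

#define TYPE_NAME(x) _Generic((x),      \
    int (*)[2]: "int (*)[2]",           \
    int *:      "int *",                \
    default:    "something else")

int main(void)
{
    int a[3][2];

    /* prints: a+0 is int (*)[2], sizeof = 8 (on a 64-bit system) */
    printf("a+0 is %s, sizeof = %zu\n", TYPE_NAME(a + 0), sizeof(a + 0));
    return 0;
}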
