
Memory Allocation Problem

This question was asked in the written round of a job interview:

 #include <alloc.h>
 #define MAXROW 3
 #define MAXCOL 4

 main()
  {
    int (*p)[MAXCOL];
     p = (int (*)[MAXCOL]) malloc(MAXROW*(sizeof(*p)));
  }

How many bytes are allocated in the process?

To be honest, I did not answer the question. I did not understand the assignment to p.

Can anybody explain to me what the answer would be and how it can be deduced?


It's platform dependent.

int (*p)[MAXCOL]; declares a pointer to an array of integers MAXCOL elements wide (MAXCOL, of course, is 4 in this case). Each element this pointer points to is therefore 4*sizeof(int) bytes on the target platform.

The malloc statement allocates a memory buffer MAXROW times the size of the type p points to. Therefore, in total, MAXROW*MAXCOL integers are allocated. The actual number of bytes will depend on the target platform.
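With 4-byte ints, for example, that works out to 3 * 4 * 4 = 48 bytes.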

Also, there's probably additional memory used by the C runtime (internal bookkeeping in malloc, as well as the various process-initialization bits that happen before main is called), which is also completely platform dependent.


p is a pointer to an array of MAXCOL elements of type int, so sizeof *p (parentheses were redundant) is the size of such an array, i.e. MAXCOL*sizeof(int).

The cast on the return value of malloc is unnecessary, ugly, and considered harmful. In this case it hides a serious bug: due to the missing prototype, malloc is implicitly assumed to return int, which is incompatible with its actual return type (void *), resulting in undefined behavior.
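For reference, a minimal corrected sketch: with the proper header included, malloc has a prototype and the cast can simply be dropped:

#include <stdlib.h>

#define MAXROW 3
#define MAXCOL 4

int main(void)
{
    int (*p)[MAXCOL];

    /* With a prototype in scope, no cast is needed in C */
    p = malloc(MAXROW * sizeof *p);

    free(p);
    return 0;
}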


sizeof(*p) will be MAXCOL*sizeof(int). So in total, MAXROW*MAXCOL*sizeof(int) bytes are allocated.


You might want to check out cdecl for help translating C declarations into English. In this instance, int (*p)[4]; becomes declare p as pointer to array 4 of int.


#include <alloc.h>
#define MAXROW 3
#define MAXCOL 4

main()
{
    int (*p)[MAXCOL];
    p = (int (*)[MAXCOL]) malloc(MAXROW*(sizeof(*p)));
}

How many bytes are allocated in the process?

p is a pointer, so it will occupy sizeof(int (*)[MAXCOL]) bytes on the stack, which might look daunting, but it's almost always the same as sizeof(void*), sizeof(int*), or any other pointer. Pointer sizes are what give applications their classification as 16-, 32-, or 64-bit, and this pointer will be sized correspondingly.

Then p is pointed at some memory obtained from malloc...

malloc( MAXROW * sizeof(*p) )

sizeof(*p) is the size of the int array that p points to, namely sizeof(int) * MAXCOL, so we get

malloc( MAXROW * (sizeof(int) * MAXCOL) )

requested from the heap. For illustrative purposes, if we assume the common 32-bit int size, we're looking at 48 bytes. The actual usage may be rounded up to whatever the heap routines feel like (heap routines often use fixed-size "buckets" to speed their operations).
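If you want to see how much the allocator actually set aside for a given request, glibc provides malloc_usable_size() in <malloc.h>; a quick sketch, with the caveat that this function is glibc-specific and non-portable:

#include <stdio.h>
#include <stdlib.h>
#include <malloc.h>  /* glibc-specific, for malloc_usable_size() */

#define MAXROW 3
#define MAXCOL 4

int main(void)
{
    int (*p)[MAXCOL] = malloc(MAXROW * sizeof *p);

    /* We requested 48 bytes (with 4-byte ints); the allocator may reserve more. */
    printf("requested %zu, usable %zu\n",
           MAXROW * sizeof *p, malloc_usable_size(p));

    free(p);
    return 0;
}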

To confirm this expectation, simply substitute a logging function for malloc():

#include <stdio.h>

#define MAXROW 3
#define MAXCOL 4

/* Stand-in for malloc that just logs the requested size. */
void* our_malloc(size_t n)
{
    printf("malloc(%zu)\n", n);  /* %zu is the correct conversion for size_t */
    return 0;                    /* return a null pointer; only the log matters */
}

int main()
{
    int (*p)[MAXCOL];
    p = (int (*)[MAXCOL]) our_malloc(MAXROW*(sizeof(*p)));
}

Output on my Linux box:

malloc(48)

The fact that malloc's returned pointer is cast to p's type doesn't affect the amount of memory allocation done.

As R sharply observes, the lack of a malloc prototype would cause the compiler to expect malloc to return int rather than the actually-returned void*. In practice, it's probable that the lowest sizeof(int) bytes of the pointer would survive the conversion; if sizeof(void*) happened to equal sizeof(int), or (more tenuous yet) the heap memory happened to start at an address representable in an int despite pointers being larger (i.e. all the truncated bits were 0s anyway), then later dereferencing of the pointer just might work. Cheap plug: C++ won't compile the call unless it has seen a prototype.

That said, perhaps your alloc.h contains a malloc prototype... I don't have an alloc.h so I guess it's non-Standard.

Any program will also allocate memory for many other things, such as a stack frame providing some context within which main() may be called. The amount of memory for that varies with the compiler, version, compiler flags, operating system, etc.


int (*p)[MAXCOL] == int (*p)[4] == "pointer to array 4 of int" (see Note below)

sizeof(*p) would then be what p points to, i.e. 4 * sizeof(int). Multiply that by MAXROW and your final answer is:

12 * sizeof(int)
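(With the common 4-byte int, that works out to 48 bytes.)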

Note: This is in contrast to:

int *p[MAXCOL] == int *p[4] == "array 4 of pointer to int"

The parentheses make quite a bit of difference!
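A quick sketch to make the contrast concrete (the example sizes in the comments assume 4-byte ints and 8-byte pointers; both vary by platform):

#include <stdio.h>

int main(void)
{
    int (*p)[4];  /* pointer to array 4 of int */
    int *q[4];    /* array 4 of pointer to int */

    printf("sizeof *p = %zu\n", sizeof *p);  /* 4 * sizeof(int),   e.g. 16 */
    printf("sizeof q  = %zu\n", sizeof q);   /* 4 * sizeof(int *), e.g. 32 */
    return 0;
}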


It should be MAXROW*MAXCOL*sizeof(int) bytes.


I really dislike questions like this, because I think it's far better, as a working engineer, to run the experiment than to assume you know what you are doing - especially if there's reason for suspicion, such as a program not working as expected or someone crafting trick questions.

#include <stdlib.h>
#include <stdio.h>
#define MAXROW 3
#define MAXCOL 4

int main(void)
{
  int (*p)[MAXCOL];
  int bytes = MAXROW * (sizeof(*p));  /* 3 * (4 * sizeof(int)) */
  p = (int (*)[MAXCOL]) malloc(bytes);
  printf("malloc called for %d bytes\n", bytes);
  return 0;
}

On a 32 bit linux system:

gcc test.c
./a.out
malloc called for 48 bytes

(edited to remove pasting accident of multiplying by maxrow twice, yielding mistaken size of 144 bytes)


Running the following in codepad.org:

//#include<alloc.h>
#include <stdio.h>
#include <stdlib.h>

#define MAXROW 3
#define MAXCOL 4

int main()
{
    int (*p)[MAXCOL];
    p = (int (*)[MAXCOL]) malloc(MAXROW*(sizeof(*p)));

    int x = MAXROW*(sizeof(*p));
    printf("%d", x);

    return 0;
}

prints out 48.

Why? Because MAXROW is 3 and sizeof(*p) is 16, so we get 3 * 16 = 48.

Why is sizeof(*p) 16? Because MAXCOL is 4, so p is a pointer to an array of 4 ints. Each int is 32 bits = 4 bytes. 4 ints in the array * 4 bytes = 16.

Why is sizeof(*p) not 4? Because it is the size of what p points to, not the size of p. To be the size of p it would have to be sizeof(p), which would be 4, as p is a pointer.
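A two-line check of that distinction (the pointer size printed will be 4 or 8 depending on the platform):

#include <stdio.h>

#define MAXCOL 4

int main(void)
{
    int (*p)[MAXCOL];

    printf("sizeof p  = %zu\n", sizeof p);   /* size of the pointer itself */
    printf("sizeof *p = %zu\n", sizeof *p);  /* MAXCOL * sizeof(int), e.g. 16 */
    return 0;
}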

Pedantically you could add:

  1. If the machine had 8-byte ints (as on some 64-bit systems), the answer would be 96.
  2. As the question states "How many bytes are allocated in the process?", you need to add the 4 bytes for the pointer p itself.
  3. malloc can allocate more than you ask for (but not less), so the question cannot be answered exactly.
  4. In a similar vein to 2, you could argue that since the process also loads system DLLs such as the C runtime DLL in order to run, it is allocating the space for those too. Then you could argue for the space allocated by DLLs that are being injected into the process by other (non-system) processes, such as those injected by Actual Window Manager and its ilk. But how pedantic do we want to get?

But I think the question is really asking for 48, with possible extra credit for explaining 96.


How about zero bytes, because that won't even compile without the non-standard alloc.h? If you can't compile it, you can't run it, and if you can't run it, it can't allocate any memory at all.

