
Why does NSLog print an extra zero when logging this array?

With these variables:

NSInteger dataStart;
uint64_t dataSize[1];
const unsigned char *beginning;
NSInteger bytesEnd;

...at these values:

dataStart = 499
dataSize[0] = 427
beginning = 9060864
bytesEnd = 9061793

...the following code:

NSLog(@"dataStart = %d, dataSize[0] = %d, beginning = %d, bytesEnd = %d",
        dataStart, dataSize[0], (NSInteger)beginning, bytesEnd);

...sends this to the console:

dataStart = 499, dataSize[0] = 427, beginning = 0, bytesEnd = 9060864

In other words, an extra zero has been inserted after the array, bumping the other variables along. It does this consistently. I'm using Xcode 3.2.3. What is happening here?

[Edit for emphasis: It's not just the old favorite of printing a zero where a value should be because a cast is wrong. It's inserting an extra zero, then printing the correct value of beginning where it should print bytesEnd, and not printing bytesEnd.]

Thanks,


Just a guess, but are you compiling in 64-bit mode? The beginning pointer is 64 bits in that case, but %d expects a 32-bit int, so you're only printing the high-order 32 bits of the pointer, which are zero.

I guess this is a contrived example because no one in their right mind would cast like this. Right?


You have the wrong format specifier. %d is not sufficient to print a uint64_t, which is what dataSize[0] is. Use %llu (or the PRIu64 macro from <inttypes.h>); %ld would only be correct where long happens to be 64 bits.

