NSIntegerMax vs NSUIntegerMax

NSUInteger index = [self.objects indexOfObject:obj];

if (index == NSNotFound) {
    // Success! Note: NSNotFound internally uses NSIntegerMax
}

if (index == NSUIntegerMax) {
    // Fails!
}

Why? I'm supposed to get an unsigned value as the result of indexOfObject:. So, naturally, I assumed that if the object is not found, it would return NSUIntegerMax instead of NSIntegerMax. Is this a bug, or is there a logical explanation for this behavior?


Perhaps NSNotFound is meant to be usable in contexts that use NSInteger. It is also safer in case someone declares the index as NSInteger instead of NSUInteger.

At most, one could say that it's odd that NSNotFound is defined as NSIntegerMax, but it's certainly not a bug.
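
For illustration, here is a minimal, self-contained sketch (the array contents and variable names are made up) showing that the same NSNotFound check works whether the index is declared as NSUInteger, matching indexOfObject:'s return type, or as NSInteger:

#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        NSArray *objects = @[@"a", @"b"];

        // Declared as NSUInteger, matching the return type of indexOfObject:.
        NSUInteger uIndex = [objects indexOfObject:@"missing"];
        if (uIndex == NSNotFound) {
            NSLog(@"Not found (NSUInteger index)");
        }

        // Declared as NSInteger: the comparison against NSNotFound still
        // behaves as expected, because NSNotFound fits in a signed NSInteger.
        NSInteger sIndex = (NSInteger)[objects indexOfObject:@"missing"];
        if (sIndex == NSNotFound) {
            NSLog(@"Not found (NSInteger index)");
        }
    }
    return 0;
}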


Your assumption is simply wrong. The method returns NSNotFound, whatever that value is now or will be in the future. And it may easily differ between platforms (e.g. 32-bit vs 64-bit).

NSNotFound exists precisely so that you can use it. Don't fool yourself by comparing against other constants that merely happen to have the same value.
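
To see why comparing against NSUIntegerMax fails, here is a quick sketch that prints the constants side by side (the exact values are platform-dependent; the comments assume a 64-bit build):

#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        // On a 64-bit platform this prints:
        //   NSNotFound    = 9223372036854775807   (2^63 - 1)
        //   NSIntegerMax  = 9223372036854775807   (2^63 - 1)
        //   NSUIntegerMax = 18446744073709551615  (2^64 - 1)
        NSLog(@"NSNotFound    = %lu", (unsigned long)NSNotFound);
        NSLog(@"NSIntegerMax  = %lu", (unsigned long)NSIntegerMax);
        NSLog(@"NSUIntegerMax = %lu", (unsigned long)NSUIntegerMax);
    }
    return 0;
}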


An NSUInteger is unsigned (hence the 'U'), while an NSInteger is signed.

You can still compare the values of an unsigned and a signed integer, but assuming that NSNotFound is equal to NSUIntegerMax is a mistake: it is in fact defined as equal to NSIntegerMax in NSObjCRuntime.h.
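
For reference, the relevant definitions look roughly like this (paraphrased from NSObjCRuntime.h; the exact typedefs and preprocessor conditions vary between SDK versions and architectures):

// Paraphrased excerpt, 64-bit (LP64) case:
typedef long NSInteger;             // signed
typedef unsigned long NSUInteger;   // unsigned

#define NSIntegerMax   LONG_MAX     // 0x7fffffffffffffff
#define NSUIntegerMax  ULONG_MAX    // 0xffffffffffffffff

enum { NSNotFound = NSIntegerMax };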
