
How can things still work if the size of a data type was figured out wrongly?

This is what is in the code:

typedef  unsigned long  int  ub4;   /* unsigned 4-byte quantities */

However:

(gdb) p sizeof(unsigned long  int)
$7 = 8

So unsigned long int actually takes 8 bytes on my CPU. How can things still work when there are such fundamental mistakes?

BTW, why can't ub4 be recognized by gdb?

(gdb) p sizeof(ub4)
No symbol "ub4" in current context.


I'm going to assume that you're talking about C. Please tag your question with the correct language.

Things can still "work" as long as no code assumes a ub4 is precisely 4 bytes. Where code does make that assumption, such errors can lead to things like buffer overruns (which may not be obviously detectable, and could even go unnoticed for years before randomly manifesting symptoms), or they may happen to cause no visible symptoms at all. Either way they invoke Undefined Behaviour and should be fixed.
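For instance, here's a minimal sketch of how that assumption can overrun a buffer. The function and the buffer sizing are hypothetical, not from the actual hash code in question:

#include <string.h>

typedef unsigned long int ub4;   /* intended: unsigned 4-byte quantities */

/* Hypothetical caller sized `out` as count * 4 bytes,
   assuming sizeof(ub4) == 4. */
void store_hashes(const ub4 *hashes, size_t count, unsigned char *out)
{
    /* On a platform where unsigned long is 8 bytes, this copies
       count * 8 bytes and overruns `out` by count * 4 bytes. */
    memcpy(out, hashes, count * sizeof(ub4));
}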

It's also possible that the developer just happened to never really use the type width as an assumption in his code.
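One way to fix it, as a sketch assuming a C99 compiler with <stdint.h>, is to define the type from a fixed-width integer and verify the size at compile time:

#include <stdint.h>

typedef uint32_t ub4;   /* exactly 4 bytes on any conforming platform */

/* Compile-time check: the array size evaluates to -1, a hard
   compile error, if the size assumption is ever violated. */
typedef char ub4_is_4_bytes[sizeof(ub4) == 4 ? 1 : -1];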

