Stringizing #a in a #define: why is it bad?
#include <stdio.h>
#define print_int(a) printf("%s : %d\n",#a,(a))
int main(void) {
    int y = 10;
    print_int(y);
    return 0;
}
I am taking a class and have been asked to explain why this is bad... So I guess stringizing #a is the problem. It does work, so why is it dangerous?
Because it bypasses type safety. What happens when someone who hates you writes print_int("5412")?
#include <stdio.h>
#define print_int(a) printf("%s : %d\n",#a,(a))
int main(void) {
    print_int("1123123");
    return 0;
}
outputs
$ gcc test.c
test.c: In function ‘main’:
test.c:4: warning: format ‘%d’ expects type ‘int’, but argument 3 has type ‘char *’
$ ./a.out
"1123123" : 3870
I don't think it's bad at all. The stringize operator is very useful for writing macros like assertions:
#define assert(x) do { if (!(x)) { printf("assert failed: %s\n", #x); } } while (0)
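Here is a minimal sketch of that idea in use (the macro is renamed my_assert here purely to avoid colliding with the standard <assert.h> macro):

#include <stdio.h>

/* Same pattern as above, renamed so it does not clash with <assert.h>. */
#define my_assert(x) do { if (!(x)) { printf("assert failed: %s\n", #x); } } while (0)

int main(void) {
    int n = 3;
    my_assert(n > 0);    /* condition holds, prints nothing */
    my_assert(n == 42);  /* prints: assert failed: n == 42 */
    return 0;
}

The failure message contains the exact text of the expression, which is something a plain function could never report.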
You can abuse any useful feature. I once had the bright idea to "simplify" Qt Atoms by writing:
#define ATOM(x) (((#x)[0] << 24) | ((#x)[1] << 16) | ...
So you could say ATOM(MPEG) and get ('M' << 24 | 'P' << 16 | ...). In fact, it worked well enough that gcc could produce integer constants from it... sometimes. Now that was evil!
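For completeness, here is a guess at what the full four-character macro looks like; the last two shift terms are my reconstruction, since the line above is truncated:

#include <stdio.h>

/* Packs the first four characters of the stringized argument into an int.
   The [2] and [3] terms are reconstructed; the original post trails off. */
#define ATOM(x) (((#x)[0] << 24) | ((#x)[1] << 16) | ((#x)[2] << 8) | ((#x)[3]))

int main(void) {
    printf("ATOM(MPEG) = 0x%08x\n", ATOM(MPEG)); /* 'M','P','E','G' packed into one value */
    return 0;
}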
Preprocessor macros are generally considered evil. Bad things happen when I write:
int a = 15;
print_int(a++);
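To make the problem concrete, here is a quick trace of that call, assuming the same print_int macro from the question:

#include <stdio.h>
#define print_int(a) printf("%s : %d\n",#a,(a))

int main(void) {
    int a = 15;
    /* Expands to: printf("%s : %d\n", "a++", (a++));
       so the "label" is the literal text a++ and the increment is a
       side effect hidden inside what looks like a simple print call. */
    print_int(a++);   /* prints: a++ : 15, and a is now 16 */
    return 0;
}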