#defining constants in C++
In various C code, I see constants defined like this:
#define T 100
Whereas in C++ examples, it is almost always:
const int T = 100;
It is my understanding that in the first case, the preprocessor will replace every instance of T with 100. In the second example, T is actually stored in memory.
Is it considered bad programming practice to #define constants in C++?
Yes, because all macros (which are what #defines define) live in a single global namespace and they take effect everywhere. Variables, including const-qualified variables, can be encapsulated in classes and namespaces.
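For instance, something along these lines (BUFFER_SIZE and kBufferSize are invented names for the sketch, not anything from the question):

    #include <iostream>

    #define BUFFER_SIZE 512   // a macro: one global name, substituted as text everywhere below

    namespace net  { const int kBufferSize = 1024; }   // encapsulated: only reachable as net::kBufferSize
    namespace disk { const int kBufferSize = 4096; }   // no clash -- each namespace owns its own constant

    int main() {
        std::cout << BUFFER_SIZE << ' '          // plain text substitution
                  << net::kBufferSize << ' '     // scoped, type-checked lookup
                  << disk::kBufferSize << '\n';
    }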
Macros are used in C because in C, a const-qualified variable is not actually a constant; it is just a variable that cannot be modified. A const-qualified variable cannot appear in a constant expression, so it can't be used as an array size, for example.
In C++, a const-qualified object that is initialized with a constant expression (like const int x = 5 * 2;) is a constant and can be used in a constant expression, so you can and should use them.
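Roughly, the difference looks like this (kRows and kCols are names I've made up for the sketch):

    const int kRows = 5 * 2;   // initialized with a constant expression, so kRows is itself one
    enum { kCols = 4 };        // the classic C workaround for a named constant usable as an array size

    int table[kRows][kCols];   // legal in C++ thanks to kRows; in C, kRows would have to be a macro or enum

    int main() {
        table[0][0] = 1;
        return table[0][0];
    }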
There is no requirement that T be stored "in memory" in the second case, unless you do something like take its address. This is true of all variables.
The reason the second one is better is that the first will frequently "disappear" in the preprocessor phase, so the compiler phase never sees it (and hence doesn't give it to you in debug information). But that behaviour is not mandated by the standard; rather, it's common practice.
There's little need to use #define statements any more other than for conditional compilation. Single constants can be done with const, multiple related constants can be done with enum, and macros can be replaced with inline functions.
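Roughly like this (kPi, Color, and square are made-up names, not anything from the answer):

    // Single constant: const instead of "#define PI 3.14159"
    const double kPi = 3.141592653589793;

    // Related constants: enum instead of a series of #defines
    enum Color { Red, Green, Blue };

    // Function-like macro replaced by an inline function.
    // Unlike "#define SQUARE(x) ((x) * (x))", this is type-checked
    // and evaluates its argument exactly once.
    inline int square(int x) { return x * x; }

    int main() {
        Color c = Green;
        return square(static_cast<int>(c)) + static_cast<int>(kPi);  // 1 + 3 = 4
    }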
Due to the differences between the concepts of constant in C and C++, in C we are basically forced to use #define (or enum) most of the time; const just doesn't work in C in most cases. But in C++ there's no such problem, so it is indeed bad practice to rely on #defined constants in C++ (unless you really need a textually-substituted constant for some reason).
Preprocessor macros do not respect scope - a macro is a simple text substitution - while static const int blah = 1; can be enclosed in a namespace. The compiler will still optimize both cases (unless you take the address of the variable), but the const is type- and scope-safe.
Yes. At the very least, use enums. Both const int and enums will be evaluated at compile time, so you get the same performance. However, the const or enum is much cleaner, is easier to debug (the debugger will actually know what T is), is type-safe, and is less likely to break in complex expressions.
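For example (the deliberately unparenthesized N_MACRO below is mine, to show the hazard):

    #include <iostream>

    #define N_MACRO 10 + 5        // sloppy macro: no parentheses, just pasted text
    const int kN = 10 + 5;        // evaluated as a value, then used as a value

    int main() {
        std::cout << N_MACRO * 2 << '\n';   // expands to 10 + 5 * 2  -> prints 20
        std::cout << kN * 2 << '\n';        // (10 + 5) * 2           -> prints 30
    }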
Yes. The biggest reason is that preprocessor definitions do not obey the scoping rules of the language, polluting the global namespace, and worse -- they're even replaced in cases like
x->sameNameAsPreprocessorToken
Since preprocessor definitions are replaced at the textual level, other normal properties of variables do not apply - you can take the address of an int const, but not of a #define'd constant.
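A rough sketch of both hazards (LIMIT, Config, and kLimit are illustrative names):

    #include <iostream>

    #define LIMIT 100                  // text substitution, ignores scope

    struct Config {
        // int LIMIT;                  // would not compile: the line expands to "int 100;"
        int limit;                     // have to dodge the macro with a different name
    };

    const int kLimit = 100;            // a real object: scoped, typed, addressable
    const int* where = &kLimit;        // fine; "&LIMIT" would expand to "&100" and not compile

    int main() {
        Config c{42};
        std::cout << c.limit + *where << '\n';   // 142
    }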
As noted by others, you also typically lose type safety and debugging ability.
One other cool point is that global integral constants can be optimized out by the compiler so that they do not take up any space (i.e., memory). Therefore, they can be treated as literal constants when they are used and be as optimal as #define-based constants, without all of the preprocessor issues.