#define VERSION 1.0.0 - too many decimal points
-
Vaclav_Sal wrote:
Apparently #define VERSION does not simply replace VERSION with the literal 1.0.0; instead the 1.0.0 is analyzed by the GCC compiler.
When you use that statement, the value after the word
VERSION
has two decimal points, so it is not a valid token, and the compiler rejects it. A #define
statement must contain valid C/C++ code, so the value after the identifier must be a valid literal or an expression which resolves to a valid literal; see http://msdn.microsoft.com/en-us/library/teas0593.aspx[^].
-
Thanks for the link, Richard. Here is the reason why it had too many decimal points: "The token-string argument consists of a series of tokens, such as keywords, constants, or complete statements." It passes if the token string is just 1.0, but obviously it has to be correctly formatted for printf to display it right.
-
If it's used as a string, then why don't you #define it as a string? If you don't know how to correctly use #define, why do you use it at all? It's bad style anyway! Make it a const string instead:
const std::string VERSION = "1.0.0";
There. Works every time. And if the compiler complains, the code that uses it is wrong! That is the advantage of using const instead of #define: the compiler will notify you of usage problems, whereas in case of #define there's no guarantee the compiler will catch a glitch, and if it does, it will likely not point to the right position in your code.
Vaclav_Sal wrote:
PS What is the correct name for "the stuff" after #define and VERSION?
The correct name is "clutter", or more to the point: "stuff that clogs your global namespace". #define symbols have a nasty habit of colliding with variable and function names elsewhere because they pollute the entire global namespace. Just don't use it!
GOTOs are a bit like wire coat hangers: they tend to breed in the darkness, such that where there once were few, eventually there are many, and the program's architecture collapses beneath them. (Fran Poretto)
Stefan_Lang wrote:
#define symbols ... pollute the entire global namespace
Ummm... what? They are gone as soon as the preprocessor completes.
-
Please read the OP and if you do not know the answer do not bother to reply. I did not ask for a lecture why not to use #define.
That's just rude.
THESE PEOPLE REALLY BOTHER ME!! How can they know what you should do without knowing what you want done?!?! -- C++ FQA Lite
-
Stefan_Lang wrote:
#define symbols ... pollute the entire global namespace
Ummm... what? They are gone as soon as the preprocessor completes.
That's true, of course. Still you may have a clash with a global symbol.
THESE PEOPLE REALLY BOTHER ME!! How can they know what you should do without knowing what you want done?!?! -- C++ FQA Lite
-
That's true, of course. Still you may have a clash with a global symbol.
THESE PEOPLE REALLY BOTHER ME!! How can they know what you should do without knowing what you want done?!?! -- C++ FQA Lite
That's OK, it'll just rock the casbah. :jig:
-
That's OK, it'll just rock the casbah. :jig:
:thumbsup:
THESE PEOPLE REALLY BOTHER ME!! How can they know what you should do without knowing what you want done?!?! -- C++ FQA Lite
-
Please read the response.
I thought that was a pretty good response... :thumbsup:
-
My test (above) with Borland's compiler had no trouble -- provided I stringized the value. But there are better ways to skin that cat.
-
Stefan_Lang wrote:
#define symbols ... pollute the entire global namespace
Ummm... what? They are gone as soon as the preprocessor completes.
It once took me more than a day to resolve an issue that manifested as some inexplicable and incomprehensible error message somewhere in the depths of the MS-provided STL implementation. In the end it turned out that the #defined symbols
min
and
max
from the Windows header files managed to wreak so much havoc in the implementation files of std::valarray that the error messages not only were totally unrecognizable but also pointed to an entirely different place in the code! That's what I mean by cluttering the global namespace: just about anywhere in your code, any macro from a totally unrelated part has the potential to destroy your code to the point where you recognize neither the location nor the cause of the problem! Fixing such an issue in a codebase of 3 million lines of code is no fun at all!
GOTOs are a bit like wire coat hangers: they tend to breed in the darkness, such that where there once were few, eventually there are many, and the program's architecture collapses beneath them. (Fran Poretto)
-
And VAX/DEC/HP C of course.