CODE PROJECT For Those Who Code
#define VERSION 1.0.0 - too many decimal points

C / C++ / MFC · 17 Posts · 6 Posters
Tags: question, visual-studio, hardware, announcement
Vaclav_ (original post):

Apparently #define VERSION 1.0.0 does not simply replace VERSION with the literal 1.0.0; the 1.0.0 is analyzed by the GCC compiler. I did some searching and found that certain "tokens" have a special function in #define. I did not quite get which tokens (the period is one of them), but I would like to know whether this is something new or specific to GCC (as used by the Arduino IDE). Of course it works "normally" if the token is enclosed in quotation marks as a string. Any comments will be appreciated. Cheers, Vaclav. PS: What is the correct name for "the stuff" after #define and VERSION?

Lost User (#5):

    Vaclav_Sal wrote:

    Apparently #define VERSION   does not simply replace the VERSION with literal 1.0.0 but the 1.0.0 is analyzed by GCC compiler.

When you use that statement, the value after the word VERSION has two decimal points, so it is not a valid token, and the compiler rejects it. A #define statement must contain valid C/C++ code, so the value after the identifier must be a valid literal, or an expression which resolves to a valid literal; see http://msdn.microsoft.com/en-us/library/teas0593.aspx.

PIEBALDconsult (#6):

Vaclav_ wrote:

Please read the OP and if you do not know the answer do not bother to reply. I did not ask for a lecture why not to use #define.

Please read the response.

PIEBALDconsult (#7):

Lost User wrote:

When you use that statement the value after the word VERSION has two decimal points so it is not a valid token, and the compiler rejects it.

My test (above) with Borland's compiler had no trouble -- provided I stringized the value. But there are better ways to skin that cat.

Vaclav_ (#8):

Lost User wrote:

A #define statement must contain valid C/C++ code, so the value after the identifier must be a valid literal or expression which resolves to a valid literal.

Thanks for the link, Richard. Here is the reason why it had too many decimal points: "The token-string argument consists of a series of tokens, such as keywords, constants, or complete statements." It passes if the token string is just 1.0, but obviously it has to be correctly formatted for printf to display right.

Stefan_Lang:

If it's used as a string, why don't you #define it as a string? And if you don't know how to correctly use #define, why use it at all? It's bad style anyway! Make it a const string instead:

            const std::string VERSION = "1.0.0";

            There. Works every time. And if the compiler complains, the code that uses it is wrong! That is the advantage of using const instead of #define: the compiler will notify you of usage problems, whereas in case of #define there's no guarantee the compiler will catch a glitch, and if it does, it will likely not point to the right position in your code.

            Vaclav_Sal wrote:

            PS What is the correct name for "the stuff" after #define and VERSION?

            The correct name is "clutter", or more to the point: "stuff that clogs your global namespace". #define symbols have a nasty habit of colliding with variable and function names elsewhere because they pollute the entire global namespace. Just don't use it!

            GOTOs are a bit like wire coat hangers: they tend to breed in the darkness, such that where there once were few, eventually there are many, and the program's architecture collapses beneath them. (Fran Poretto)

PIEBALDconsult (#9):

            Stefan_Lang wrote:

            #define symbols ... pollute the entire global namespace

            Ummm... what? They are gone as soon as the preprocessor completes.

CPallini (#10):

Vaclav_ wrote:

Please read the OP and if you do not know the answer do not bother to reply. I did not ask for a lecture why not to use #define.

That's just rude.

              THESE PEOPLE REALLY BOTHER ME!! How can they know what you should do without knowing what you want done?!?! -- C++ FQA Lite

              In testa che avete, signor di Ceprano?

CPallini (#11):

PIEBALDconsult wrote:

Ummm... what? They are gone as soon as the preprocessor completes.

That's true, of course. Still, you may have a clash with a global symbol.
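A small sketch of such a clash, using a hypothetical function-like max macro (windows.h's min/max, discussed later in the thread, are the classic real-world case):

```cpp
#include <algorithm>

// Hypothetical macro colliding with a standard name:
#define max(a, b) ((a) > (b) ? (a) : (b))

// Writing std::max(a, b) here would be macro-expanded into invalid code;
// the extra parentheses suppress function-like macro expansion.
int bigger(int a, int b) {
    return (std::max)(a, b);
}
```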

PIEBALDconsult (#12):

CPallini wrote:

That's true, of course. Still you may have a clash with a global symbol.

That's OK, it'll just rock the casbah. :jig:

CPallini (#13):

PIEBALDconsult wrote:

That's OK, it'll just rock the casbah. :jig:

:thumbsup:

Albert Holguin (#14):

PIEBALDconsult wrote:

Please read the response.

I thought that was a pretty good response... :thumbsup:

Lost User (#15):

PIEBALDconsult wrote:

My test (above) with Borland's compiler had no trouble -- provided I stringized the value.

Same with Microsoft C++, as you would expect.

Stefan_Lang (#16):

PIEBALDconsult wrote:

Ummm... what? They are gone as soon as the preprocessor completes.

It once took me more than a day to resolve an issue that manifested as an inexplicable, incomprehensible error message deep in the MS-provided STL implementation. In the end it turned out that the #defined symbols min and max from the Windows header files wreaked so much havoc in the implementation files of std::valarray that the error messages were not only totally unrecognizable but also pointed to an entirely different place in the code! That's what I mean by cluttering the global namespace: just about anywhere in your code, any macro from a totally unrelated part has the potential to destroy your code to the point where you recognize neither the location nor the cause of the problem. Fixing such an issue in a codebase of 3 million lines of code is not fun at all!
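Two standard mitigations for the windows.h min/max clash described here (NOMINMAX is a real windows.h opt-out; everything else is plain standard C++, shown as a portable sketch):

```cpp
// Before including windows.h, NOMINMAX prevents the macros from being
// defined at all:
//   #define NOMINMAX
//   #include <windows.h>
//
// If the macros are already in force, they can be removed or sidestepped:
#include <algorithm>

#undef min   // no-ops if the macros were never defined
#undef max

int smaller(int a, int b) {
    // Parenthesizing the name also defeats function-like macro expansion.
    return (std::min)(a, b);
}
```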

PIEBALDconsult (#17):

Lost User wrote:

Same with Microsoft C++, as you would expect.

And VAX/DEC/HP C, of course.
