Hungarian notation

The Lounge (Code Project) · Tags: csharp, com, agentic-ai, json, question · 56 Posts · 26 Posters
• Nemanja Trifunovic wrote:

    Nishant Sivakumar wrote:

    Maybe : Pen->PutInDrawer() then

    Hehehe, but the pen is not performing the operation either. I would go with something like:

    Nish->Put(pen, drawer);


    Programming Blog utf8-cpp

    Nish Nishant replied (#36):

    Yeah, that looks better :-)

    Regards, Nish


    Nish’s thoughts on MFC, C++/CLI and .NET (my blog)
    Currently working on C++/CLI in Action for Manning Publications. (*Sample chapter available online*)
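
    A minimal C++ sketch of the three shapes being compared above (Pen->PutInDrawer(), Drawer->PutPen(), and Nish->Put(pen, drawer)); the classes and bodies are purely illustrative, not taken from any real API:

        // Illustrative only: where does "put the pen in the drawer" belong?
        #include <iostream>

        struct Drawer;

        struct Pen {
            void PutInDrawer(Drawer& d);          // option 1: the pen acts on itself
        };

        struct Drawer {
            void PutPen(Pen&) { std::cout << "drawer stores pen\n"; }   // option 2: the drawer receives the pen
        };

        struct Person {
            void Put(Pen& p, Drawer& d) { d.PutPen(p); }   // option 3: the actor performs the action
        };

        void Pen::PutInDrawer(Drawer& d) { d.PutPen(*this); }

        int main() {
            Pen pen; Drawer drawer; Person nish;
            pen.PutInDrawer(drawer);   // Pen->PutInDrawer()
            drawer.PutPen(pen);        // Drawer->PutPen()
            nish.Put(pen, drawer);     // Nish->Put(pen, drawer)
        }

    All three compile to the same effect; the exchange above is only about which spelling reads most naturally.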

• David Crow wrote:

      No, .NET and I have yet to cross paths.


      "Approved Workmen Are Not Ashamed" - 2 Timothy 2:15

      "Judge not by the eye but by the heart." - Native American Proverb

      Nish Nishant replied (#37):

      DavidCrow wrote:

      No, .NET and I have yet to cross paths.

      Okay, but Chris D was asking about people using (or not using) Hungarian notation with C#. So I was a bit surprised when you replied saying you use that exclusively :-)

      Regards, Nish


      Nish’s thoughts on MFC, C++/CLI and .NET (my blog)
      Currently working on C++/CLI in Action for Manning Publications. (*Sample chapter available online*)

• Christopher Duncan wrote:

        I've noticed that the C# folks at Microsoft have promoted a different naming convention that uses no variable type prefix. At the same time, I've observed that it's now trendy for people to dislike Hungarian notation. When I first started Windows programming Hungarian was indeed strange to get used to. But then, so was the Windows API. However, these days when I look at variable names without it and am left to either guess or search through the code to determine what the variable type is, I find myself thinking that these variable names are only one step removed from the old Basic days of names such as A, B, etc. Why would a straightforward and easy to grasp system of conveying crucial information to the programmer at a glance suddenly become so unpopular? Is there technical reasoning behind it, or is it just a new generation who feels that they must do things differently than those who came before in order to proclaim their identity?

        Author of The Career Programmer and Unite the Tribes www.PracticalStrategyConsulting.com

        Michael Dunn replied (#38):

        (too lazy to read the whole thread) The problem is that the original purpose of Hungarian was lost. Hungarian should be about showing the purpose of a variable, which is not necessarily its type (as defined by the language). For example, an int or a DWORD could have a prefix of cch to mean "count of characters." This tells you what the variable is for, and doesn't give any indication of its type. What happened later (and I don't know the timeline exactly) is people forgot (or never learned) about the purpose aspect of Hungarian, and thought it was just an indicator of the variable's type. It's this altered Hungarian that people protest against. Original Hungarian, though, is extremely useful in indicating information that the language can't express or verify. A great example is cb versus cch - count of bytes versus count of characters. The distinction is hugely important when dealing with buffers and Unicode strings.

        --Mike-- Visual C++ MVP :cool: LINKS~! Ericahist | PimpFish | CP SearchBar v3.0 | C++ Forum FAQ Ford, what's this fish doing in my ear?
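
        An illustrative C++ fragment of the cb/cch distinction described above (the variable names are assumed, not taken from any real codebase): both counts have the same language type, so only the purpose prefix says which unit each one carries:

            // cch = count of characters, cb = count of bytes: same C++ type, different purpose.
            #include <cstddef>
            #include <cwchar>
            #include <iostream>

            int main() {
                wchar_t wszName[64] = L"Hungarian";                      // wsz: wide, zero-terminated string

                std::size_t cchName = std::wcslen(wszName);              // count of characters (9)
                std::size_t cbName  = (cchName + 1) * sizeof(wchar_t);   // count of bytes, including the terminator

                // Passing cchName where a byte count is expected (or vice versa) still
                // compiles cleanly -- the prefix is the only hint that the units differ.
                std::cout << "characters: " << cchName << ", bytes: " << cbName << '\n';
            }

        In the type-prefix style Michael objects to, both counts would simply carry an integer prefix, which tells the reader nothing the compiler doesn't already know.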

• Christopher Duncan wrote:

          A reasonable point. However, not everyone writes code in the IDE. In fact, I'm continually surprised that anyone does. :)

          Author of The Career Programmer and Unite the Tribes www.PracticalStrategyConsulting.com

          Michael P Butler replied (#39):

          Christopher Duncan wrote:

          A reasonable point. However, not everyone writes code in the IDE. In fact, I'm continually surprised that anyone does.

          I'm surprised. For C# development, the VS2005 IDE can't really be beaten. I certainly wouldn't consider using anything else.

          Michael CP Blog [^] Development Blog [^]

• Chris Maunder replied (#40) to Christopher Duncan's post quoted above (before #38):

            I think there were licensing issues with using Hungarian so Microsoft had to drop it. I hear they're working on their own version that they'll submit to ECMA.

            cheers, Chris Maunder

            CodeProject.com : C++ MVP

• Shog9 wrote:

              Christopher Duncan wrote:

              Why would a straightforward and easy to grasp system of conveying crucial information to the programmer at a glance suddenly become so unpopular?

              Suddenly? I've hated it since i first saw it - any excuse to ditch it is fine by me... FWIW: the way i heard it explained, The Mad Hungarian originally came up with The Notation as a way to convey meaning as to how the variable would be used. So integers that store coordinates get a different prefix than integers storing measurements which are different than loop counters... This actually makes a bit of sense, if you can be consistent. But the number of times i've seen that done correctly and consistently... well, i could probably count it on the fingers of one foot. Add in all the shitty code out there using incorrect or misleading prefixes, and it becomes an active hindrance. Also, it isn't really Intellisense friendly.

              ---- Scripts i’ve known... CPhog 1.8.2 - make CP better. Forum Bookmark 0.2.5 - bookmark forum posts on Pensieve Print forum 0.1.2 - printer-friendly forums Expand all 1.0 - Expand all messages In-place Delete 1.0 - AJAX-style post delete Syntax 0.1 - Syntax highlighting for code blocks in the forums

              Stuart Dootson replied (#41):

              Shog9 wrote:

              The Mad Hungarian originally came up with The Notation as a way to convey meaning as to how the variable would be used. So integers that store coordinates get a different prefix than integers storing measurements which are different than loop counters...

              But if you really want to do that, you use a really strongly typed language like Ada (or you emulate really strong numeric types in C++) and create a new numeric type for each different sort of number. Naming conventions don't work - compiler enforcement will (until people realise casts exist).
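
              A small C++ sketch of the compiler-enforcement approach Stuart describes; the tag and type names here are assumptions, purely for illustration:

                  // Distinct wrapper types let the compiler, not a naming prefix, catch unit mix-ups.
                  #include <iostream>

                  template <typename Tag>
                  struct Quantity {
                      int value;
                      explicit Quantity(int v) : value(v) {}
                  };

                  using PixelX     = Quantity<struct PixelXTag>;      // screen coordinate
                  using Millimetre = Quantity<struct MillimetreTag>;  // physical measurement

                  PixelX moveRight(PixelX x, PixelX delta) {
                      return PixelX{x.value + delta.value};
                  }

                  int main() {
                      PixelX x{100};
                      Millimetre width{30};

                      std::cout << moveRight(x, PixelX{25}).value << '\n';
                      // moveRight(x, width);  // does not compile: Millimetre is not a PixelX
                      (void)width;
                  }

              This is roughly what Ada gives you for free with derived numeric types; in C++ you build the wrapper yourself, and, as Stuart notes, someone can always reach in and convert explicitly.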

• Hans Dietrich replied (#42) to Chris Maunder (#40):

                Chris Maunder wrote:

                 licensing issues with using Hungarian

                Link?

• 123 0 (The Grand Negus) wrote:

                  Nishant Sivakumar wrote:

                  Drawer->PutPen() would be better than both of them, because this doesn't really require a full understanding of English grammar and sentence semantics.

                  I hope you're kidding. First of all, show that statement to any non-programmer and see if they don't think the arrow is backwards. Secondly, remember that millions of English speakers who don't have a "full understanding of English grammar and sentence semantics" communicate quite effectively, in English, every day. Natural languages work, even when they're poorly used and/or not fully understood by the speakers. That's why everybody uses them. Even you.

                  DavidNohejl replied (#43):

                  The Grand Negus wrote:

                  Natural languages work, even when they're poorly used and/or not fully understood by the speakers. That's why everybody uses them. Even you.

                  Yes, for fking human-to-human communication, but for exactly describing data, algorithms and processes there are better tools, e.g. programming languages.


                  "Throughout human history, we have been dependent on machines to survive. Fate, it seems, is not without a sense of irony. " - Morpheus

• Chris Maunder replied (#44) to Hans Dietrich (#42):

                    Notice the joke icon next to my post? ;)

                    cheers, Chris Maunder

                    CodeProject.com : C++ MVP

• 123 0 replied (#45) to DavidNohejl (#43):

                      dnh wrote:

                      Yes, for fking human-to-human communication, but for exactly describing data, algorithms and processes there are better tools, e.g. programming languages.

                      We disagree. And we're qualified to comment on the matter because we have described the data, algorithms and processes necessary for a significantly broad and deep application, a complete development system - including unique interface, simplified file manager, hexadecimal dumper, elegant text editor, wysiwyg page editor, and native-code generating compiler - conveniently and efficiently using nothing but Plain English. So until you've done the equivalent, both ways, as we have, perhaps it would be wiser for you to simply withhold judgment.

• Christopher Duncan replied (#46) to Chris Maunder (#40):

                        :laugh:

                        Author of The Career Programmer and Unite the Tribes www.PracticalStrategyConsulting.com

• Tim Craig replied (#47) to Christopher Duncan's post quoted above (before #38):

                          Christopher Duncan wrote:

                          I've noticed that the C# folks at Microsoft have promoted a different naming convention that uses no variable type prefix. At the same time, I've observed that it's now trendy for people to dislike Hungarian notation.

                          Did you notice that while they were at it, the C# folks now force their idea of how the curly braces are to be indented? Visual Studio doesn't offer a choice the way it does when the project is C++.

                          The evolution of the human genome is too important to be left to chance idiots like CSS.

• Rohde replied (#48) to Christopher Duncan's post quoted above (before #39):

                            Please tell me why that's surprising to you. I simply don't understand it. :-D What does a normal text editor give you that VS doesn't? If you don't use any of the bells and whistles of the IDE, that's fine, but surely it doesn't provide anything less than any normal editor, so why not just use the IDE as an editor with a built-in compiler? Granted, VS sucks for C++, but for C# it's really, really good.


                            "When you have made evil the means of survival, do not expect men to remain good. Do not expect them to stay moral and lose their lives for the purpose of becoming the fodder of the immoral. Do not expect them to produce, when production is punished and looting rewarded. Do not ask, `Who is destroying the world?' You are."
                            -Atlas Shrugged, Ayn Rand

• DavidNohejl replied (#49) to 123 0 (#45):

                              The Grand Negus wrote:

                              So until you've done the equivalent, both ways, as we have, perhaps it would be wiser for you to simply withhold judgment.

                              Do you think I never used natural language to describe an algorithm?! :mad: While convenience is subjective, I think that the thousands upon thousands of scientists etc. who have developed and used formal languages for hundreds of years agree with me. Yes, one *can* program using plain English. But it sucks. The only use I can see is to allow people without formal education to program. Wow, that's cool. Not.


                              "Throughout human history, we have been dependent on machines to survive. Fate, it seems, is not without a sense of irony. " - Morpheus

• 123 0 replied (#50) to DavidNohejl (#49):

                                I repeat, for your benefit: "Until you've done the equivalent, both ways, as we have, perhaps it would be wiser for you to simply withhold judgment."

• DavidNohejl replied (#51) to 123 0 (#50):

                                  I repeat, I DID use natural language to describe a program, and I DID use formal language to describe a program.


                                  "Throughout human history, we have been dependent on machines to survive. Fate, it seems, is not without a sense of irony. " - Morpheus

• 123 0 replied (#52) to DavidNohejl (#51):

                                    dnh wrote:

                                    I DID use natural language to describe a program, and I DID use formal language to describe a program.

                                    Can you send me samples so we can discuss this further with real-life examples? I'm quite sure you've missed some important points. From your profile I gather that you are rather young and inexperienced. I think you'd benefit from further discussion. I'm willing to take the time if you're willing to open your mind to the thought that you might be wrong.

• DavidNohejl replied (#53) to 123 0 (#52):

                                      The Grand Negus wrote:

                                      Can you send me samples so we can discuss this further with real-life examples?

                                      By using natural language to describe a program I mean a spec. Surely I don't have to send you any? Formal language, that would be UML, flow charts, various programming languages...

                                      The Grand Negus wrote:

                                      I'm quite sure you've missed some important points.

                                      Possibly.

                                      The Grand Negus wrote:

                                      From your profile I gather that you are rather young and inexperienced.

                                      Possibly.

                                      The Grand Negus wrote:

                                      I think you'd benefit from further discussion.

                                      Possibly. The idea of being able to "compile" a software spec into ready-to-go software has been around for quite some time. That's a nice idea, but I don't think (Plain) English is the right tool for the job. Try to exactly describe an advanced algorithm in English. Then do it in a formalized language. Do you really think the English version is better? Your turn.


                                      "Throughout human history, we have been dependent on machines to survive. Fate, it seems, is not without a sense of irony. " - Morpheus

• 123 0 replied (#54) to DavidNohejl (#53):

                                        dnh wrote:

                                        Try to exactly describe an advanced algorithm in English. Then do it in a formalized language. Do you really think the English version is better? Your turn.

                                        Can we agree that an algorithm for efficient compilation of natural-language source code into native-code executables is an "advanced algorithm"? If so, then the exercise you suggest has been completed and, yes, we really think the English version is better for the following reasons: (1) The natural language version reflects, most closely, what we were thinking about the algorithm itself. It is the most natural expression of the algorithm we can imagine, because, when my son and I labored over the algorithm during its design and implementation, we used many of the same sentences both to discuss and to implement the ideas we were expressing to one another. In other words, we implemented the thing using the same words we employed to discuss it. You can't get any closer to the original thoughts, or more "natural", than that. (2) The natural language version eliminates unnecessary, intermediate steps. Once the appropriate ideas are "put into words" in the usual and natural way, we're essentially done. Those same words can be compiled and run. It's the shortest distance between concept and implementation. Please note, however, three things: (1) We agree that diagrammatic approaches to certain problems can be helpful. Some problems are easier to formulate and solve with the left brain, some with the right; most with a combination of the two. Our argument is that the obvious and natural way to present pictures and diagrams is within a natural-language framework. Like the photographs and other illustrations that appear within a largely text-based encyclopedia. Words without visuals can be very effective (think of books and radio and this very message); visuals without words are far less effective (think of television without sound and captions; think of replying to this message using only diagrams and formulae). (2) We agree that specialized, artificial sub-languages can be useful as well. But again, our argument is that the most obvious and natural place for sub-languages to appear is within a natural-language framework. Consider, for example, this[

• DavidNohejl replied (#55) to 123 0 (#54):

                                          The Grand Negus wrote:

                                          Can we agree that an algorithm for efficient compilation of natural-language source code into native-code executables is an "advanced algorithm"?

                                          yes.

                                          The Grand Negus wrote:

                                          (1) The natural language version reflects, most closely, what we were thinking about the algorithm itself.

                                          OK, cool. But I'm not that much interested in what you were thinking about the algorithm, rather in the algorithm itself.

                                          The Grand Negus wrote:

                                          It is the most natural expression of the algorithm we can imagine, because, when my son and I labored over the algorithm during its design and implementation, we used many of the same sentences both to discuss and to implement the ideas we were expressing to one another. In other words, we implemented the thing using the same words we employed to discuss it. You can't get any closer to the original thoughts, or more "natural", than that.

                                          That's way too subjective - I for one have always preferred pictures.

                                          The Grand Negus wrote:

                                          (2) The natural language version eliminates unnecessary, intermediate steps. Once the appropriate ideas are "put into words" in the usual and natural way, we're essentially done. Those same words can be compiled and run. It's the shortest distance between concept and implementation.

                                          That sounds great, but to *fully* describe a usual software project you will probably end up with a two-meter-high tower of paper.

                                          The Grand Negus wrote:

                                          (1) We agree that diagrammatic approaches to certain problems can be helpful. Some problems are easier to formulate and solve with the left brain, some with the right; most with a combination of the two. Our argument is that the obvious and natural way to present pictures and diagrams is within a natural-language framework. Like the photographs and other illustrations that appear within a largely text-based encyclopedia. Words without visuals can be very effective (think of books and radio and this very message); visuals without words are far less effective (think of television without sound and captions; think of replying to this message using only diagrams and formulae).

                                          Yes, yes, yes. In COMMUNICATION targeted at a HUMAN. Now please describe a snowflake to me using natural language...
