Code Project
Hungarian notation

The Lounge
Tags: csharp, com, agentic-ai, json, question
56 Posts 26 Posters 0 Views 1 Watching
This topic has been deleted. Only users with topic management privileges can see it.
Christopher Duncan wrote (#1):

    I've noticed that the C# folks at Microsoft have promoted a different naming convention that uses no variable type prefix. At the same time, I've observed that it's now trendy for people to dislike Hungarian notation. When I first started Windows programming Hungarian was indeed strange to get used to. But then, so was the Windows API. However, these days when I look at variable names without it and am left to either guess or search through the code to determine what the variable type is, I find myself thinking that these variable names are only one step removed from the old Basic days of names such as A, B, etc. Why would a straightforward and easy to grasp system of conveying crucial information to the programmer at a glance suddenly become so unpopular? Is there technical reasoning behind it, or is it just a new generation who feels that they must do things differently than those who came before in order to proclaim their identity?

    Author of The Career Programmer and Unite the Tribes www.PracticalStrategyConsulting.com


Michael P Butler wrote (#2):

I guess since you can now just hover your mouse over a variable and a tooltip will show you the info you need, Hungarian notation becomes a little redundant.

      Michael CP Blog [^] Development Blog [^]


Nish Nishant wrote (#3):

        Christopher Duncan wrote:

        However, these days when I look at variable names without it and am left to either guess or search through the code to determine what the variable type is, I find myself thinking that these variable names are only one step removed from the old Basic days of names such as A, B, etc.

If the variables are properly named, you wouldn't really have to look up their type. employeeName is not going to be a double, and itemCount is not going to be a string. I was so used to a quasi-Hungarian notation that I used it in my C# code too. But nowadays I am making a strong attempt to use the .NET naming guidelines when writing C# - once in a while I still goof up and end up writing Hungarian.

        Regards, Nish


        Nish’s thoughts on MFC, C++/CLI and .NET (my blog)
        Currently working on C++/CLI in Action for Manning Publications. (*Sample chapter available online*)


Frank Kerrigan wrote (#4):

          Yes everyone has gone all French with Pascal case :-D

Grady Booch: I told Google to their face...what you need is some serious adult supervision. (2007 Turing lecture) http://www.frankkerrigan.com


Jim Crafton wrote (#5):

            Slow day at the office Chris? Looking for a quick flamewar? :)

            Christopher Duncan wrote:

            Why would a straightforward and easy to grasp system of conveying crucial information to the programmer at a glance suddenly become so unpopular?

That of course is most certainly a matter of opinion. My opinion is that:
a) it's neither straightforward nor easy
b) it can easily get out of date, i.e. the int changes to a double, the char* changes to a string object, etc.
c) it makes variable names look ugly - it's just not aesthetically pleasing to the eye
d) everyone seems to have their own variations on it
My rule is that if you can't figure out what the variable is from its name, then it's probably got a bad name, and Hungarian notation won't help much here, other than to add more letters to the name :)

            ¡El diablo está en mis pantalones! ¡Mire, mire! Real Mentats use only 100% pure, unfooled around with Sapho Juice(tm)! SELECT * FROM User WHERE Clue > 0 0 rows returned Save an Orange - Use the VCF! Techno Silliness


Chris Losinger wrote (#6):

              MS said Jump Left, they all jumped left. MS said Jump Right, they all jumped right.

              image processing toolkits | batch image processing | blogging


peterchen wrote (#7):

                So which type of Hungarian Notation do you mean?[^]


                Developers, Developers, Developers, Developers, Developers, Developers, Velopers, Develprs, Developers!
                We are a big screwed up dysfunctional psychotic happy family - some more screwed up, others more happy, but everybody's psychotic joint venture definition of CP
                Linkify!|Fold With Us!


zoid wrote (#8):

In my opinion Hungarian notation was a bad idea to begin with. If I make a class called Address, how do I define a variable of that type? adMyAddress? addrMyAddress? It's totally arbitrary anyway. For built-in types such as int, char, etc. it is annoying too. Suppose you code up your application using a variable "total" and decide to declare it as an int, so you have a variable called iTotal. Suppose later a requirement comes to make it a float: now every occurrence in the code needs to be changed to fTotal. It adds unnecessary overhead in my opinion, and makes your variable names ugly.


Shog9 wrote (#9):

                    Christopher Duncan wrote:

                    Why would a straightforward and easy to grasp system of conveying crucial information to the programmer at a glance suddenly become so unpopular?

                    Suddenly? I've hated it since i first saw it - any excuse to ditch it is fine by me... FWIW: the way i heard it explained, The Mad Hungarian originally came up with The Notation as a way to convey meaning as to how the variable would be used. So integers that store coordinates get a different prefix than integers storing measurements which are different than loop counters... This actually makes a bit of sense, if you can be consistent. But the number of times i've seen that done correctly and consistently... well, i could probably count it on the fingers of one foot. Add in all the shitty code out there using incorrect or misleading prefixes, and it becomes an active hindrance. Also, it isn't really Intellisense friendly.

                    ---- Scripts i’ve known... CPhog 1.8.2 - make CP better. Forum Bookmark 0.2.5 - bookmark forum posts on Pensieve Print forum 0.1.2 - printer-friendly forums Expand all 1.0 - Expand all messages In-place Delete 1.0 - AJAX-style post delete Syntax 0.1 - Syntax highlighting for code blocks in the forums


PIEBALDconsult wrote (#10):

                      Hear Hear! It was never a good idea.


Chris Losinger wrote (#11):

                        zoid ! wrote:

                        Suppose later a requirement comes to make it a float. Now every occurance in the code needs to be changed to fTotal.

                        and how often do you do that without then going to every place you use the variable, to make sure you're not losing precision, generating overflows, truncating, etc ? or do you just change the type and hope for the best ? point is: changing the type is not a one-spot change. you should be revisiting all the code that uses that variable, which gives you a chance to change the name, while you're at it. and when you get right down to it, changing the name is going to make it very easy for you to find all those places, because the compiler is going to angrily point out each and every one of them for you.

                        image processing toolkits | batch image processing | blogging


Blake Miller wrote (#12):

Lately I have favored indicating only the SCOPE of a variable: m_ means it is a member variable of an object, s_ means it is static and local to the source file, and g_ means it is global - probably to the entire application. That lets you know what you are dealing with if you go to make changes. Other than that, give it a good name.


Christopher Duncan wrote (#13):

                            Jim Crafton wrote:

                            Looking for a quick flamewar?

                            Not at all! Just something I've been wondering about for quite some time now, figured this would be the most qualified group to ask. I probably should have added that just like the choice of languages and programmers editors, naming conventions are also a religious issue with no "right" or "wrong" to them. That said, I'm willing to bet that if the MS C# team had come up with a variable naming convention that required each variable to start with the number 42 that it would soon be the prevalent method used, and anything else would be considered old fashioned and uncool. :)

                            Author of The Career Programmer and Unite the Tribes www.PracticalStrategyConsulting.com


Fernando A Gomez F wrote (#14):

                              I like Hungarian notation, and I use it in my C++ code. However, when I began to program in C#, I realized that it was OK to use HN for numbers and strings. Yet I found that using the "obj" prefix was redundant... since everything is an object. So I think that HN is kinda useless in C# except for numeric variables, since I can either begin to create new prefixes for many classes (i.e. strm, xml, evnt) and give birth to a monster, or I can simply avoid HN. The solution for me was obvious though: I stopped programming for C# and returned to good C++ :D

                              A polar bear is a bear whose coordinates has been changed in terms of sine and cosine. Personal Site


Christopher Duncan wrote (#15):

                                A reasonable point. However, not everyone writes code in the IDE. In fact, I'm continually surprised that anyone does. :)

                                Author of The Career Programmer and Unite the Tribes www.PracticalStrategyConsulting.com


PIEBALDconsult wrote (#16):

                                  Not at all; unless the compiler enforces it (in which case it's not a "convention" anyway), you can do as you wish.


David Crow wrote (#17):

                                    zoid ! wrote:

                                    Suppose you code up your application using a variable "total" and decide to declare it as an int. So you have a variable called iTotal. Suppose later a requirement comes to make it a float. Now every occurance in the code needs to be changed to fTotal.

                                    You're going to have to change all those references to a float anyway, so changing the name of the variable does not add any more work.


                                    "Approved Workmen Are Not Ashamed" - 2 Timothy 2:15

                                    "Judge not by the eye but by the heart." - Native American Proverb


                                      123 0
                                      wrote on last edited by
                                      #18

                                      Christopher Duncan wrote:

                                      Why would a straightforward and easy to grasp system of conveying crucial information to the programmer at a glance suddenly become so unpopular?

Because it was unnatural to begin with. It's the compiler's job - not the programmer's - to keep track of internal data types and to perform appropriate conversions as required. "Crucial information" at a low level is nothing but distracting noise at a high level. Which of these statements, for example, is simpler and clearer: "Put the pen in the drawer." or "vPut aThe nPen pIn aThe nDrawer."? Obviously, the first. And this technique - allowing the lower level systems to handle the details of data type recognition and conversion - obviously works or you wouldn't be able to understand this very post.


                                        Nish Nishant
                                        wrote on last edited by
                                        #19

                                        The Grand Negus wrote:

                                        Put the pen in the drawer. vPut aThe nPen pIn aThe nDrawer.

                                        Drawer->PutPen() would be better than both of them :) Because this doesn't really require a full understanding of English grammar and sentence semantics.

                                        Regards, Nish


                                        Nish’s thoughts on MFC, C++/CLI and .NET (my blog)
                                        Currently working on C++/CLI in Action for Manning Publications. (*Sample chapter available online*)

                                          Kevin McFarlane
                                          wrote on last edited by
                                          #20

                                          Shog9 wrote:

                                          The Mad Hungarian originally came up with The Notation as a way to convey meaning as to how the variable would be used. So integers that store coordinates get a different prefix than integers storing measurements which are different than loop counters...

                                          Yes, I'd read that too. Then everyone, including MS, got the wrong end of the stick and started using it to denote type.
