Hungarian notation
-
A reasonable point. However, not everyone writes code in the IDE. In fact, I'm continually surprised that anyone does. :)
Author of The Career Programmer and Unite the Tribes www.PracticalStrategyConsulting.com
Christopher Duncan wrote:
A reasonable point. However, not everyone writes code in the IDE. In fact, I'm continually surprised that anyone does.
I'm surprised. For C# development, the VS2005 IDE can't really be beaten. I certainly wouldn't consider using anything else.
Michael CP Blog [^] Development Blog [^]
-
I've noticed that the C# folks at Microsoft have promoted a different naming convention that uses no variable type prefix. At the same time, I've observed that it's now trendy for people to dislike Hungarian notation. When I first started Windows programming, Hungarian was indeed strange to get used to. But then, so was the Windows API. However, these days when I look at variable names without it and am left to either guess or search through the code to determine what the variable type is, I find myself thinking that these variable names are only one step removed from the old Basic days of names such as A, B, etc. Why would a straightforward and easy to grasp system of conveying crucial information to the programmer at a glance suddenly become so unpopular? Is there technical reasoning behind it, or is it just a new generation who feels that they must do things differently from those who came before in order to proclaim their identity?
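For readers who never worked in the old style, here's a rough, hypothetical illustration of the contrast (the names below are made up, not taken from any real codebase) - Win32-style Hungarian prefixes first, then the prefix-free convention the .NET guidelines favor:

// Systems Hungarian: the prefix encodes the declared type.
unsigned long dwTimeoutMs;   // dw   = DWORD
const char*   lpszCaption;   // lpsz = long pointer to a zero-terminated string
bool          bVisible;      // b    = BOOL
int           nRetryCount;   // n    = integer

// Prefix-free style: the name describes the meaning; the type
// comes from the declaration (or the IDE's tooltip).
unsigned long timeoutMs;
const char*   caption;
bool          visible;
int           retryCount;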
Author of The Career Programmer and Unite the Tribes www.PracticalStrategyConsulting.com
I think there were licensing issues with using Hungarian so Microsoft had to drop it. I hear they're working on their own version that they'll submit to ECMA.
cheers, Chris Maunder
CodeProject.com : C++ MVP
-
Christopher Duncan wrote:
Why would a straightforward and easy to grasp system of conveying crucial information to the programmer at a glance suddenly become so unpopular?
Suddenly? I've hated it since I first saw it - any excuse to ditch it is fine by me... FWIW: the way I heard it explained, The Mad Hungarian originally came up with The Notation as a way to convey meaning as to how the variable would be used. So integers that store coordinates get a different prefix than integers storing measurements which are different than loop counters... This actually makes a bit of sense, if you can be consistent. But the number of times I've seen that done correctly and consistently... well, I could probably count it on the fingers of one foot. Add in all the shitty code out there using incorrect or misleading prefixes, and it becomes an active hindrance. Also, it isn't really IntelliSense-friendly.
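For what it's worth, a minimal, hypothetical sketch of that usage-oriented flavor (none of these names come from real code) - every variable here is a plain int, but the prefix records how the value is meant to be used rather than what its type is:

// "Apps Hungarian": the prefix records usage, not the declared type.
int xButton;     // x  = a horizontal screen coordinate
int yButton;     // y  = a vertical screen coordinate
int dxButton;    // dx = a width (a difference of two x coordinates)
int cChildren;   // c  = a count of something
// An expression like xButton + cChildren now looks as wrong as it is -
// provided, as noted above, that everyone applies the prefixes consistently.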
---- Scripts i’ve known... CPhog 1.8.2 - make CP better. Forum Bookmark 0.2.5 - bookmark forum posts on Pensieve Print forum 0.1.2 - printer-friendly forums Expand all 1.0 - Expand all messages In-place Delete 1.0 - AJAX-style post delete Syntax 0.1 - Syntax highlighting for code blocks in the forums
Shog9 wrote:
The Mad Hungarian originally came up with The Notation as a way to convey meaning as to how the variable would be used. So integers that store coordinates get a different prefix than integers storing measurements which are different than loop counters...
But if you really want to do that, you use a really strongly typed language like Ada (or you could emulate really strong numeric types in C++) and create a new numeric type for each different sort of number. Naming conventions don't work - compiler enforcement will (until people realise casts exist).
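To make that "compiler enforcement" point concrete, here's a minimal sketch of how one might emulate distinct numeric types in C++ - a hypothetical tag-based wrapper, not anyone's actual library:

#include <iostream>

// A tiny "strong typedef": the Tag parameter makes otherwise identical
// wrappers into distinct, incompatible types.
template <typename Tag>
struct Strong {
    int value;
    explicit Strong(int v) : value(v) {}
};

struct CoordinateTag {};
struct MeasurementTag {};

using Coordinate  = Strong<CoordinateTag>;
using Measurement = Strong<MeasurementTag>;

// Only like types may be added together.
Coordinate operator+(Coordinate a, Coordinate b) { return Coordinate(a.value + b.value); }

int main() {
    Coordinate  x(10);
    Measurement width(42);

    Coordinate y = x + Coordinate(5);   // fine
    // Coordinate z = x + width;        // compile error: no such operator+
    std::cout << y.value << "\n";       // though anyone determined can still go through .value
    return 0;
}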
-
Chris Maunder wrote:
licensing issues with using Hungarian
Link?
-
Nishant Sivakumar wrote:
Drawer->PutPen() would be better than both of them. Because this doesn't really require a full understanding of English grammar and sentence semantics.
I hope you're kidding. First of all, show that statement to any non-programmer and see if they don't think the arrow is backwards. Secondly, remember that millions of English speakers who don't have a "full understanding of English grammar and sentence semantics" communicate quite effectively, in English, every day. Natural languages work, even when they're poorly used and/or not fully understood by the speakers. That's why everybody uses them. Even you.
The Grand Negus wrote:
Natural languages work, even when they're poorly used and/or not fully understood by the speakers. That's why everybody uses them. Even you.
Yes, for fking human-to-human communication, but for exactly describing data, algorithms and processes there are better tools, e.g. programming languages.
"Throughout human history, we have been dependent on machines to survive. Fate, it seems, is not without a sense of irony. " - Morpheus
-
dnh wrote:
Yes, for fking human-to-human communication, but for exactly describing data, algorithms and processes there are better tools, e.g. programming languages.
We disagree. And we're qualified to comment on the matter because we have described the data, algorithms and processes necessary for a significantly broad and deep application, a complete development system - including a unique interface, simplified file manager, hexadecimal dumper, elegant text editor, WYSIWYG page editor, and native-code-generating compiler - conveniently and efficiently using nothing but Plain English. So until you've done the equivalent, both ways, as we have, perhaps it would be wiser for you to simply withhold judgment.
-
I think there were licensing issues with using Hungarian so Microsoft had to drop it. I hear they're working on their own version that they'll submit to ECMA.
cheers, Chris Maunder
CodeProject.com : C++ MVP
:laugh:
Author of The Career Programmer and Unite the Tribes www.PracticalStrategyConsulting.com
-
Christopher Duncan wrote:
I've noticed that the C# folks at Microsoft have promoted a different naming convention that uses no variable type prefix. At the same time, I've observed that it's now trendy for people to dislike Hungarian notation.
Did you notice that, while they were at it, the C# folks now force their idea of how the curly braces are to be indented? Visual Studio doesn't offer a choice the way it does when the project is C++.
The evolution of the human genome is too important to be left to chance idiots like CSS.
-
A reasonable point. However, not everyone writes code in the IDE. In fact, I'm continually surprised that anyone does. :)
Author of The Career Programmer and Unite the Tribes www.PracticalStrategyConsulting.com
Please tell me why that's surprising for you. I simply don't understand it. :-D What does a normal text editor give you that VS doesn't? If you don't use any of the bells and whistles of the IDE that's fine, but surely it doesn't provide anything less than any normal editor, so why not just use the IDE as an editor with a built-in compiler? Granted, VS sucks for C++, but for C# it's really, really good.
"When you have made evil the means of survival, do not expect men to remain good. Do not expect them to stay moral and lose their lives for the purpose of becoming the fodder of the immoral. Do not expect them to produce, when production is punished and looting rewarded. Do not ask, `Who is destroying the world?' You are."
-Atlas Shrugged, Ayn Rand
-
The Grand Negus wrote:
So until you've done the equivalent, both ways, as we have, perhaps it would be wiser for you to simply withhold judgment.
Do you think I never used natural language to describe an algorithm?! :mad: While convenience is subjective, I think that the thousands upon thousands of scientists who have developed and used formal languages for hundreds of years agree with me. Yes, one *can* program using plain English. But it sucks. The only use I can see is allowing people without formal education to program. Wow, that's cool. Not.
"Throughout human history, we have been dependent on machines to survive. Fate, it seems, is not without a sense of irony. " - Morpheus
-
I repeat, for your benefit: "Until you've done the equivalent, both ways, as we have, perhaps it would be wiser for you to simply withhold judgment."
I repeat, I DID use natural language to describe a program, and I DID use a formal language to describe a program.
"Throughout human history, we have been dependent on machines to survive. Fate, it seems, is not without a sense of irony. " - Morpheus
-
dnh wrote:
I DID use natural language to describe a program, and I DID use a formal language to describe a program.
Can you send me samples so we can discuss this further with real-life examples? I'm quite sure you've missed some important points. From your profile I gather that you are rather young and inexperienced. I think you'd benefit from further discussion. I'm willing to take the time if you're willing to open your mind to the thought that you might be wrong.
-
The Grand Negus wrote:
Can you send me samples so we can discuss this further with real-life examples?
By using natural language to describe a program I mean a spec. Surely I don't have to send you one? Formal language - that would be UML, flow charts, various programming languages...
The Grand Negus wrote:
I'm quite sure you've missed some important points.
Possibly.
The Grand Negus wrote:
From your profile I gather that you are rather young and inexperienced.
Possibly.
The Grand Negus wrote:
I think you'd benefit from further discussion.
Possibly. The idea of being able to "compile" a software spec into ready-to-go software has been around for quite some time. It's a nice idea, but I don't think (Plain) English is the right tool for the job. Try to exactly describe an advanced algorithm in English. Then do it in a formalized language. Do you really think the English version is better? Your turn.
"Throughout human history, we have been dependent on machines to survive. Fate, it seems, is not without a sense of irony. " - Morpheus
-
dnh wrote:
Try to exactly describe an advanced algorithm in English. Then do it in a formalized language. Do you really think the English version is better? Your turn.
Can we agree that an algorithm for efficient compilation of natural-language source code into native-code executables is an "advanced algorithm"? If so, then the exercise you suggest has been completed and, yes, we really think the English version is better for the following reasons:

(1) The natural language version reflects, most closely, what we were thinking about the algorithm itself. It is the most natural expression of the algorithm we can imagine, because, when my son and I labored over the algorithm during its design and implementation, we used many of the same sentences both to discuss and to implement the ideas we were expressing to one another. In other words, we implemented the thing using the same words we employed to discuss it. You can't get any closer to the original thoughts, or more "natural", than that.

(2) The natural language version eliminates unnecessary, intermediate steps. Once the appropriate ideas are "put into words" in the usual and natural way, we're essentially done. Those same words can be compiled and run. It's the shortest distance between concept and implementation.

Please note, however, three things:

(1) We agree that diagrammatic approaches to certain problems can be helpful. Some problems are easier to formulate and solve with the left brain, some with the right; most with a combination of the two. Our argument is that the obvious and natural way to present pictures and diagrams is within a natural-language framework. Like the photographs and other illustrations that appear within a largely text-based encyclopedia. Words without visuals can be very effective (think of books and radio and this very message); visuals without words are far less effective (think of television without sound and captions; think of replying to this message using only diagrams and formulae).

(2) We agree that specialized, artificial sub-languages can be useful as well. But again, our argument is that the most obvious and natural place for sub-languages to appear is within a natural-language framework. Consider, for example, this[
-
The Grand Negus wrote:
Can we agree that an algorithm for efficient compilation of natural-language source code into native-code executables is an "advanced algorithm"?
yes.
The Grand Negus wrote:
(1) The natural language version reflects, most closely, what we were thinking about the algorithm itself.
OK, cool. But I'm not that much interested in what you were thinking about the algorithm, rather in the algorithm itself.
The Grand Negus wrote:
It is the most natural expression of the algorithm we can imagine, because, when my son and I labored over the algorithm during its design and implementation, we used many of the same sentences both to discuss and to implement the ideas we were expressing to one another. In other words, we implemented the thing using the same words we employed to discuss it. You can't get any closer to the original thoughts, or more "natural", than that.
That's way too subjective - I for one always preferred pictures.
The Grand Negus wrote:
(2) The natural language version eliminates unnecessary, intermediate steps. Once the appropriate ideas are "put into words" in the usual and natural way, we're essentially done. Those same words can be compiled and run. It's the shortest distance between concept and implementation.
That sounds great, but to *fully* describe a usual software project you will probably end up with a 2-meter-high tower of paper.
The Grand Negus wrote:
(1) We agree that diagrammatic approaches to certain problems can be helpful. Some problems are easier to formulate and solve with the left brain, some with the right; most with a combination of the two. Our argument is that the obvious and natural way to present pictures and diagrams is within a natural-language framework. Like the photographs and other illustrations that appear within a largely text-based encyclopedia. Words without visuals can be very effective (think of books and radio and this very message); visuals without words are far less effective (think of television without sound and captions; think of replying to this message using only diagrams and formulae).
Yes, yes, yes. In COMMUNICATION targeted at HUMANS. Now please describe a snowflake to me using natural language...
-
dnh wrote:
That's way too subjective - I for one always preferred pictures.
And that's not subjective?
dnh wrote:
That sounds great, but to *fully* describe a usual software project you will probably end up with a 2-meter-high tower of paper.
I reiterate - we're not against diagrammatic summaries (or any other kind of summary). But those "2 meters" of code have to be written in some language. With a natural language, all of that code can be easily understood and executed. The detailed documentation (not the summary) and the source are one and the same. That's a good thing. And you're wrong about the size. Our development system - including interface, file manager, dumper, editor, page-layout facility, and compiler/linker - is significantly larger and more complex than most business systems, and yet the entire source code, printed at six lines to the inch, requires a stack of paper less than two inches high (about 5 _centi_meters).
dnh wrote:
Now please describe a snowflake to me using natural language...
From a dictionary: "one of the soft, light flakes composed of groups of crystals, in which snow falls". But I agree that a picture - or a real sample - would be helpful, especially if labeled ("snowflake") and described ("cold, easily melted, etc."). That was the point of the very paragraph you placed this query underneath. Now, why don't you describe, say, "love" or "courage" or "sin" to me without using words?
dnh wrote:
He's using a lot of natural language because he expected readers not to speak math. Try a better example.
No, I'm going to stick with this one. First, because most people - by far - aren't "conversant with the mathematical apparatus of theoretical physics" and never will be. We're developing systems for normal people, not specialists. We've already said that specialists can have all the languages they want, as long as they fit them into a natural language framework so we know when they're headed off to one of their "secret meetings" to discuss things that are obscure to the rest of us. And secondly, because even when talking to his peers, Einstein used more natural language than formulae. It's the nature of human beings, however intelligent. Natural languages are truly general-purpose communication tools. Perhaps that's why you and I are