Into bad habits
-
Strict languages help a lot - if you listen to the warnings they give. (Personally, I run with "treat warnings as errors" turned on.) For example:
if(myInt & 0x80 != 0)
is an error in C# - the result of the sub-expression isn't a bool. In C++ it compiles, but just doesn't do what you expect... This one caught me in an answer in Q&A: I had got so used to C# telling me off that I assumed the operator precedence was correct when C++ didn't give me a warning or error...
Ideological Purity is no substitute for being able to stick your thumb down a pipe to stop the water
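Since the thread is also about JavaScript, here is the same trap sketched there (hypothetical value; JS shares C's precedence here, with `!=` binding tighter than `&`):

```javascript
var myInt = 0x80;                    // bit 7 set, low bit clear

// Parsed as myInt & (0x80 != 0): the comparison runs first, yields
// true, which coerces to 1, so this computes myInt & 1 -- the LOW bit.
var buggy = (myInt & 0x80 != 0) ? "set" : "clear";    // "clear" (!)

// Parenthesized, it tests bit 7 as intended.
var fixed = ((myInt & 0x80) != 0) ? "set" : "clear";  // "set"
```

C# refuses to compile the buggy form because `int & bool` has no meaning there; C++ and JavaScript both accept it silently.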
That's why I never bother learning the precedence rules in a programming language, and fully parenthesize expressions. Yeah, the hoi polloi will look down their noses at my pansy-assed, belt-and-suspenders way of doing things, but then I don't have to debug stupid mistakes like the one in your expression:
if ((myInt & 0x80) != 0)
My mistakes are far more intelligent, and only make me want to stab someone in the face with a spoon. For example, today I had a project that kept getting unresolved references at link time, even though the import library was in the correct location and a number of other projects linked to it with no problems. I spent four hours playing with linker options, copying the import library to various locations, and so on, before I finally found the problem: the compiler on the DLL project was set to not treat
wchar_t
as a builtin type, while the compiler on the project using the DLL was set to do so. The function signatures in the DLL therefore didn't match, because of name-mangling differences. Grrr... BTW: I was the one who screwed this up and had the compiler set differently in the two projects. One spoonicide, coming up.
Software Zen: delete this;
-
Bah, for a second I thought this was about real software. Then I saw the name "MelonCard" and "TechCrunch" and realized they were talking about web "programming". Next they'll be crying about the time they found out that they were not, in fact, visited by magical flying unicorns.
The devil is in my pants! Look, look! SELECT * FROM User WHERE Clue > 0 0 rows returned Save an Orange - Use the VCF! Personal 3D projects Just Say No to Web 2 Point Blow
-
Joe Woodbury wrote:
Except the problem here really wasn't with 'var' but with scope and scope can get you in most languages.
But clearly, from what the author said, he had no intention of declaring a global variable; he had simply forgotten to use the var keyword. Combine that with JS's behavior and you get the problem. But again, the root of the problem is that JavaScript does not enforce variable declaration, creating a big potential for bad habits.
Joe Woodbury wrote:
Of course, if you want to start a list of what's wrong with javascript, it's going to get mighty long.
Couldn't agree more. That makes having good habits even more important.
"To alcohol! The cause of, and solution to, all of life's problems" - Homer Simpson
-
Yep! Annoying, that, isn't it? As a result, I always used brackets - which made the code messy - until I got well into C#, where they aren't so necessary. So I stopped. :doh: That's also why I always use redundant brackets for
if
and
for
constructs:
if (condition)
{
    statement;
}
Because in the old days (pre auto-formatting)
if (condition)
    statement;
    statement;
looked too much like it executed both statements on the condition...
Ideological Purity is no substitute for being able to stick your thumb down a pipe to stop the water
I really don't like indented braces. Just saying.
-
I was sooooo happy when compilers started giving a warning on
if (condition);
{
    statement;
}
:laugh:
Ideological Purity is no substitute for being able to stick your thumb down a pipe to stop the water
I look forward to the day when compilers give a warning because you've indented the braces... ;)
-
I look forward to the day when compilers give a warning because you've indented the braces... ;)
:laugh: Tough!
Ideological Purity is no substitute for being able to stick your thumb down a pipe to stop the water
-
OriginalGriff wrote:
if(myInt & 0x80 != 0)
When I first saw this, I was tempted to reply with: "What's wrong with this in C#?" Since I don't perform bitwise ANDs in my daily work, I didn't realize that comparison operators have precedence over bitwise operators. So I guess in C++ it's parsed as myInt & (0x80 != 0), which is just myInt & 1 - false whenever myInt is even?
"To alcohol! The cause of, and solution to, all of life's problems" - Homer Simpson
-
I don't have a choice; I have to use C. But I do like using the easier languages - you can get more done, with fewer pitfalls, than with the hard ones. Still, I think you are right: you need exposure to the hard languages to know how the machine really works. That way you write better code in the easy ones.
============================== Nothing to say.
I think that's a different issue. C++, used as a better C, issues a lot more type warnings, forcing you to check usage of types and catching certain classes of errors at compile time. A language can be low- or high-level, but still enforce type-checking to various degrees. Static typing can help catch a whole class of errors before running the program.
-
So, today I saw this on CP's daily news: How One Missing `var` Ruined our Launch[^]

There have always been endless discussions that it's not about the language, but about the programmer, when bad practices arise - like the endless flame wars against languages such as VB and JS. Now, to add some more fuel to the debate, I'd like to ask: why did a mistake like that happen? It was a simple mistake with serious consequences.

The first thing that came to my mind was that I'd never make such a mistake (ok, maybe a 0.00000001% chance of making it). To me it's like driving without a seat-belt. There's no way I'd do it, because Brazilian traffic laws are so strict about it that I got very used to it - the same way I'm used to wearing clothes when I leave home. Having programmed in strict languages for most of my professional life, declaring variables became just as natural; skipping it would feel wrong, and even the thought of it bothers me.

Would it be the same if I had got used to developing in more forgiving languages like JS? Do forgiving languages make us sloppier? I think forgiving languages may shape us to acquire bad habits. That's one of the reasons I favor C# over VB.NET. I've heard countless times that "bad code can be written in any language", but that's not the point. I'm sure that if I had done more of my work in forgiving languages, I'd be much more likely to make mistakes like that. That's why I favor strict languages: they help you develop good habits, regardless of whether you're a good or a bad programmer. Thoughts?
"To alcohol! The cause of, and solution to, all of life's problems" - Homer Simpson
Your point is why I favor the teaching of strict prescriptive grammar in primary schools...yet I find that I prefer unmanaged C++ to managed (and more restrictive) C#.
There's no doubt that strict prescription results in fewer errors of the type you described. But all things have their price, and the price of strict prescription in software is that now and then you'll need to work around the language's constraints to accomplish what you need to accomplish. It's a tradeoff, and like most such, there's no way to answer the question "Should I use this language?" without measuring its characteristics against the demands of the application you need to write.
A good engineer, like a good mechanic, always selects the right tool for the job he faces. Some languages are better for speed; others for safety; others for the support they provide for specific application needs. Few of us can afford to restrict ourselves to just one of them.
-
Fran Porretto wrote:
yet I find that I prefer unmanaged C++ to managed (and more restrictive) C#.
I actually agree with you that C++ is better for teaching newcomers, because it allows students to get a grasp of memory allocation and management. They get to understand a little more about the heap and the stack, while newcomers who go directly to a managed language will have no idea how those work.
Fran Porretto wrote:
A good engineer, like a good mechanic, always selects the right tool for the job he faces
Agreed, but that isn't really the point I was trying to make, which is whether or not one should favor strict languages to develop good habits and avoid bad ones.
"To alcohol! The cause of, and solution to, all of life's problems" - Homer Simpson
-
I really like the use of the word "Tolerate" in the settings on that page. It's like they're saying "[setting] is a bad idea, but if you insist, we can tolerate it." It's just such a better word than "Allow" for crap like that.
-
I really like the use of the word "Tolerate" in the settings on that page. It's like they're saying "[setting] is a bad idea, but if you insist, we can tolerate it." It's just such a better word than "Allow" for crap like that.
:laugh: At least there's something for those that are willing.
"To alcohol! The cause of, and solution to, all of life's problems" - Homer Simpson
-
true :)
"To alcohol! The cause of, and solution to, all of life's problems" - Homer Simpson
-
You ain't seen nothing until you go cruising through some of the HTML I now maintain. I never realized how sloppy you could get with it. Missing end tags is one thing, but some screens don't even bother to begin with . If I had known years ago, when I first started doing web applications, that I could have three or four statements in the same document, I wouldn't have had nearly as many coding hoops to jump through when Corporate forced some headers and footers on us that consultants had put together without bothering to consult the programmers.

I certainly believe languages should force some constraints on you - not to the degree of Pascal's straitjacket worn willingly, but variable declarations and loops that give you definite structure. I absolutely hate C's (and all its variants') FOR statement. At least in BASIC you know what the control variable is and what its limits are. I once coded a C FOR statement such that the variables being set, tested, and incremented had nothing to do with the loop. I was totally disgusted that the language let me get away with it, because I'd worked with enough idiots who would do something like that in their code thinking they were clever. I've spent so much of my career cleaning up the crap code left by others that I don't need to work in languages that let you do stupid stuff, either by design or by omission. Self-discipline is generally not enough to write good code.
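The for-statement abuse described above is just as legal in JavaScript; a sketch with invented variables:

```javascript
var rows = ["a", "b", "c"];
var out = [];
var i = 0;

// All three header clauses are decoys: `waste` is initialized and
// decremented but never read, and the test watches `out`, which only
// the body grows. The real control variable, `i`, is hidden in the body.
for (var waste = 99; out.length < rows.length; waste--) {
  out.push(rows[i]);
  i++;
}
// out ends up as ["a", "b", "c"] -- correct, but good luck reading it.
```

The language happily compiles it, which is exactly the complaint: the FOR header promises structure it doesn't enforce.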
Psychosis at 10 Film at 11 Those who do not remember the past, are doomed to repeat it. Those who do not remember the past, cannot build upon it.
-
I think it's a little bit of a chicken-and-egg problem. Forgiving languages are (IMO) geared toward a specific audience: people who want to develop software, but perhaps aren't as technically minded, as interested, or, to be frank, as thorough as other developers. Thus, the people who use those languages tend to be sloppier developers and make more mistakes. This is not to say that there aren't very good developers using the more 'forgiving' languages, and very poor ones using the less forgiving. However, for anyone to suggest that, as a class, VB developers are better than C# developers - well, I don't think that stands up under scrutiny. There are certainly overlaps in those two populations, though.
-
Your point about VB is only valid with "Option Strict Off". You need to use "Option Strict On". You'll find it's a lot less forgiving than you make out. You may even find that it is less forgiving than C# in some ways (it forces you to be more explicit about your intentions - no "var" keyword, etc.). I recommend that all VB developers use Option Strict; it will save you loads of headaches when you actually take advantage of static type checking.
-
Vaughn Bigham wrote:
Your point about VB is only valid with "Option Strict Off".
It definitely helps, but it's not everything. Even with that option on, VB still performs a lot of implicit conversions on incompatible types. Suddenly you can apply the equals operator in lots of places you shouldn't, and I think that's pretty dangerous.
"To alcohol! The cause of, and solution to, all of life's problems" - Homer Simpson
-
Hello Fabio, I agree with you 100%. I think strictness in the language is something very desirable, especially considering that many of us coders tend to have some very intense periods, where maybe you get to sleep a couple of hours a day if you're lucky. I know there are gifted people who thrive under such intense pressure, but at least in my case, with sleep deprivation I tend to see a dramatic rise in my mistakes, and I certainly like to know the compiler is helping me a bit with the error count. :P Regarding the MelonCard article, I think the developer is cutting himself some slack. The mistake may appear subtle indeed, but I strongly believe it would have shown up in a simple two-concurrent-user test. Maybe I'm wrong, but it's not the first time I've seen a contrived, jargon-filled excuse that attempts to deflect direct responsibility for an issue. Cheers! =)