What is the longest programming misconception you've held (that you are aware of)?
-
Until just now I believed negative integers were just a flip of the first bit. Wow, how wrong I was, for MANY years! Are there some architectures where that is the case, to make myself feel a little better?
IBM 7090 apparently had that. For me I think it was the idea that local variables are created one by one, at the moment they are declared, leading to silly conclusions such as "obviously you should reuse an existing variable instead of making a new one". Maybe that's a thing in scripting languages?
-
Until just now I believed negative integers were just a flip of the first bit. Wow, how wrong I was, for MANY years! Are there some architectures where that is the case, to make myself feel a little better?
My answer changed today because of the above post[^]. :-D I was about to reply to it saying that, unlike C++, it's interesting that C# doesn't insist that default be the last label in a switch statement. But I figured I should check this, and it turns out that C++ also allows it! I'd always believed otherwise since starting to use C++ about 20 years ago, perhaps because that's the way it is in the language I used for a long time, though it uses OUT instead of default. EDIT: That's my longest known misconception; there are probably tons of others!
Robust Services Core | Software Techniques for Lemmings | Articles
The fox knows many things, but the hedgehog knows one big thing.
-
Until just now I believed negative integers were just a flip of the first bit. Wow, how wrong I was, for MANY years! Are there some architectures where that is the case, to make myself feel a little better?
Note: The following is a personal statement of preference, not an invitation to a jihad. Not really a programming misconception, but a coding style choice. For a very long time, starting in the mid-1980's through about 2010 or so, I used K&R braces exclusively. When I started writing C#, I used Allman[^] braces, following the style recommended by Microsoft and a couple of the books I was using. As time has gone on Allman has become my preferred style. I have some vision problems due to age and glaucoma, so my code needs frequent blank lines to separate logical blocks. Allman braces provide white space that isn't merely cosmetic. I've even got an editor macro that converts K&R braces to Allman. I have a large body of C++ that I recently converted as part of a refactor and refresh effort on an old product that I'm maintaining.
Software Zen:
delete this;
-
Note: The following is a personal statement of preference, not an invitation to a jihad. Not really a programming misconception, but a coding style choice. For a very long time, starting in the mid-1980's through about 2010 or so, I used K&R braces exclusively. When I started writing C#, I used Allman[^] braces, following the style recommended by Microsoft and a couple of the books I was using. As time has gone on Allman has become my preferred style. I have some vision problems due to age and glaucoma, so my code needs frequent blank lines to separate logical blocks. Allman braces provide white space that isn't merely cosmetic. I've even got an editor macro that converts K&R braces to Allman. I have a large body of C++ that I recently converted as part of a refactor and refresh effort on an old product that I'm maintaining.
Software Zen:
delete this;
You don't need to fade it by saying it's personal preference when it's the Correct™️ way. Bring the jihad! :laugh: My rationale is that other coding styles often waste horizontal space but use vertical space miserly. The control statement before the { needs to stand out so that you don't have to squint to read its condition. It also aligns the { … } and reduces the number of broken lines, which is another thing I try to avoid (hence 3-space indentation instead of 4 or even 8, whose users should be forced to edit all their spaces manually).
Robust Services Core | Software Techniques for Lemmings | Articles
The fox knows many things, but the hedgehog knows one big thing.
-
Note: The following is a personal statement of preference, not an invitation to a jihad. Not really a programming misconception, but a coding style choice. For a very long time, starting in the mid-1980's through about 2010 or so, I used K&R braces exclusively. When I started writing C#, I used Allman[^] braces, following the style recommended by Microsoft and a couple of the books I was using. As time has gone on Allman has become my preferred style. I have some vision problems due to age and glaucoma, so my code needs frequent blank lines to separate logical blocks. Allman braces provide white space that isn't merely cosmetic. I've even got an editor macro that converts K&R braces to Allman. I have a large body of C++ that I recently converted as part of a refactor and refresh effort on an old product that I'm maintaining.
Software Zen:
delete this;
Visual Studio has quite extensive options for code reformatting according to your preferred style. The preferences are given by the logged in user. You may want to set up two user names, with different formatting preferences: Log in with one name, go to the end brace, delete it and retype it, and you have the code the way you want it. Log in with the other name, do the same exercise, and code is formatted the way it should be delivered to others.
-
Until just now I believed negative integers were just a flip of the first bit. Wow, how wrong I was, for MANY years! Are there some architectures where that is the case, to make myself feel a little better?
I'm unaware that I suffer from any. :~ But, related to "negative integers", one misconception which I have seen at least one person state is the idea that signed integers (two's complement) are lower level (more native to the hardware) than unsigned integers -- that the CPU has to work harder to perform unsigned math. I'm pretty sure that I saw someone state that you should avoid using unsigned integers because they're slower! :omg:
-
Note: The following is a personal statement of preference, not an invitation to a jihad. Not really a programming misconception, but a coding style choice. For a very long time, starting in the mid-1980's through about 2010 or so, I used K&R braces exclusively. When I started writing C#, I used Allman[^] braces, following the style recommended by Microsoft and a couple of the books I was using. As time has gone on Allman has become my preferred style. I have some vision problems due to age and glaucoma, so my code needs frequent blank lines to separate logical blocks. Allman braces provide white space that isn't merely cosmetic. I've even got an editor macro that converts K&R braces to Allman. I have a large body of C++ that I recently converted as part of a refactor and refresh effort on an old product that I'm maintaining.
Software Zen:
delete this;
+5 for Allman.
-
You don't need to fade it by saying it's personal preference when it's the Correct™️ way. Bring the jihad! :laugh: My rationale is that other coding styles often waste horizontal space but use vertical space miserly. The control statement before the { needs to stand out so that you don't have to squint to read its condition. It also aligns the { … } and reduces the number of broken lines, which is another thing I try to avoid (hence 3-space indentation instead of 4 or even 8, whose users should be forced to edit all their spaces manually).
Robust Services Core | Software Techniques for Lemmings | Articles
The fox knows many things, but the hedgehog knows one big thing.
Greg Utas wrote:
edit all their spaces manually
As on a VT100, with an eighty-character limit.
-
Greg Utas wrote:
edit all their spaces manually
As on a VT100, with an eighty-character limit.
The arrival of VT100s in our university computing lab was momentous! Our DECwriters were then used mostly for printouts.
Robust Services Core | Software Techniques for Lemmings | Articles
The fox knows many things, but the hedgehog knows one big thing.
-
Note: The following is a personal statement of preference, not an invitation to a jihad. Not really a programming misconception, but a coding style choice. For a very long time, starting in the mid-1980's through about 2010 or so, I used K&R braces exclusively. When I started writing C#, I used Allman[^] braces, following the style recommended by Microsoft and a couple of the books I was using. As time has gone on Allman has become my preferred style. I have some vision problems due to age and glaucoma, so my code needs frequent blank lines to separate logical blocks. Allman braces provide white space that isn't merely cosmetic. I've even got an editor macro that converts K&R braces to Allman. I have a large body of C++ that I recently converted as part of a refactor and refresh effort on an old product that I'm maintaining.
Software Zen:
delete this;
I use Allman for C#, but for JavaScript (mostly TypeScript) I make use of K&R. I do this because it's the generally accepted style for each language and I am used to swapping between them. It's also because, working as part of a small team within a larger group (a team of 5 developers within a group of 20+ developers), it's easier to follow the generally accepted standards - or rather, code doesn't get past code review if it doesn't follow those standards. At home I do the same, using Allman for C# and K&R for TypeScript or any other form of JavaScript - it just kind of 'feels' right.
“That which can be asserted without evidence, can be dismissed without evidence.”
― Christopher Hitchens
-
That depends; some systems have a "negative space" that is one larger than the positive space (or consider 0 to be a positive number, which is also an odd idea). We'd need to move away from binary computers to sort all this crap out! Can I suggest trinary? "True", "False", and "Dunno"? :laugh:
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony "Common sense is so rare these days, it should be classified as a super power" - Random T-shirt AntiTwitter: @DalekDave is now a follower!
Save us all our pain: just use a unary system of 'Dunno'! All our problems would be solved, and none of them could be! How very Schrödinger-ish! :laugh:
-
My answer changed today because of the above post[^]. :-D I was about to reply to it saying that, unlike C++, it's interesting that C# doesn't insist that default be the last label in a switch statement. But I figured I should check this, and it turns out that C++ also allows it! I'd always believed otherwise since starting to use C++ about 20 years ago, perhaps because that's the way it is in the language I used for a long time, though it uses OUT instead of default. EDIT: That's my longest known misconception; there are probably tons of others!
Robust Services Core | Software Techniques for Lemmings | Articles
The fox knows many things, but the hedgehog knows one big thing.
I did not know C++ allowed default anywhere except the end, either, until your post. So that is my newest longest-running programming misconception! :laugh:
-
Note: The following is a personal statement of preference, not an invitation to a jihad. Not really a programming misconception, but a coding style choice. For a very long time, starting in the mid-1980's through about 2010 or so, I used K&R braces exclusively. When I started writing C#, I used Allman[^] braces, following the style recommended by Microsoft and a couple of the books I was using. As time has gone on Allman has become my preferred style. I have some vision problems due to age and glaucoma, so my code needs frequent blank lines to separate logical blocks. Allman braces provide white space that isn't merely cosmetic. I've even got an editor macro that converts K&R braces to Allman. I have a large body of C++ that I recently converted as part of a refactor and refresh effort on an old product that I'm maintaining.
Software Zen:
delete this;
I prefer Ratliff style, but can see why you would like Allman if your eyesight was failing.
-
The arrival of VT100s in our university computing lab was momentous! Our DECwriters were then used mostly for printouts.
Robust Services Core | Software Techniques for Lemmings | Articles
The fox knows many things, but the hedgehog knows one big thing.
When I started learning to program on the high school's PDP-11 in 1983, the lab had a mix of VT52s, one VT100, and a couple of Wyse VT100 clones. And one DECwriter "hard-copy terminal" we had to use when we needed to print out our work.
-
I'm unaware that I suffer from any. :~ But, related to "negative integers", one misconception which I have seen at least one person state is the idea that signed integers (twos complement) are lower level (more native to the hardware) than unsigned integers -- that the CPU has to work harder to perform unsigned math. I'm pretty sure that I saw someone state that you should avoid using unsigned integers because they're slower! :omg:
Yes, I don't think that applies for most CPUs. However, it is fairly well known that when programming GPUs with CUDA it is much better to use signed integers than unsigned, because the overflow handling is much faster. Nvidia's GPU cores are known for having many odd limitations.
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
-
I did not know C++ allowed default anywhere except the end, either, until your post. So that is my newest longest-running programming misconception! :laugh:
While not explicitly aware of it, my understanding of the switch statement is that all the case declarations (which includes default) are merely labels, so I am not surprised you can stick the default anywhere in the list; I've just never seen it done before. Kinda like the idea, though.
-
That depends, some systems have a "negative space" that is one larger than the positive space (or consider 0 to be a positive number, which is also an odd idea) We'd need to move away from binary computers to sort all this crap out! Can I suggest trinary? "True", "False", and "Dunno"? :laugh:
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony "Common sense is so rare these days, it should be classified as a super power" - Random T-shirt AntiTwitter: @DalekDave is now a follower!
OriginalGriff wrote:
Can I suggest trinary? "True", "False", and "Dunno"?
"Null"? As in SQL null, not C# null.
Wrong is evil and must be defeated. - Jeff Ello Never stop dreaming - Freddie Kruger
-
Until just now I believed negative integers were just a flip of the first bit. Wow, how wrong I was, for MANY years! Are there some architectures where that is the case, to make myself feel a little better?
1. That OO is a good idea. 2. That exceptions are a good way to handle errors.
-
Until just now I believed negative integers were just a flip of the first bit. Wow, how wrong I was, for MANY years! Are there some architectures where that is the case, to make myself feel a little better?
That other programmers knew what they were doing :sigh:
Best, Sander sanderrossel.com Migrating Applications to the Cloud with Azure arrgh.js - Bringing LINQ to JavaScript Object-Oriented Programming in C# Succinctly
-
Until just now I believed negative integers were just a flip of the first bit. Wow, how wrong I was, for MANY years! Are there some architectures where that is the case, to make myself feel a little better?
Here's an excellent explanation of two's complement: Twos complement: Negative numbers in binary - YouTube[^]