What is the longest programming misconception you've held (that you are aware of)?
-
Until just now I believed negative integers were just a flip of the first bit. Wow, how wrong I was, for MANY years! Are there some architectures where that is the case, to make myself feel a little better?
That other programmers knew what they were doing :sigh:
Best, Sander sanderrossel.com Migrating Applications to the Cloud with Azure arrgh.js - Bringing LINQ to JavaScript Object-Oriented Programming in C# Succinctly
-
Until just now I believed negative integers were just a flip of the first bit. Wow, how wrong I was, for MANY years! Are there some architectures where that is the case, to make myself feel a little better?
Here's an excellent explanation of two's complement: Twos complement: Negative numbers in binary - YouTube[^]
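To make the original misconception concrete, here is a tiny sketch (TypeScript, leaning on the fact that JavaScript's bitwise operators work on 32-bit two's-complement values). Flipping the top bit is sign-magnitude behaviour, not what today's hardware does:

// Flipping the sign bit of 5 (sign-magnitude thinking) vs. real two's-complement negation.
// On an old sign-magnitude machine the pattern 0x80000005 really would mean -5.
const x = 5;
const flippedSignBit = x ^ (1 << 31); // toggle bit 31 only
const negated = (~x + 1) | 0;         // invert all bits, then add one

console.log(flippedSignBit);  // -2147483643 on a two's-complement machine, not -5
console.log(negated);         // -5
console.log(negated === -x);  // true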
-
Note: The following is a personal statement of preference, not an invitation to a jihad. Not really a programming misconception, but a coding style choice. For a very long time, starting in the mid-1980's through about 2010 or so, I used K&R braces exclusively. When I started writing C#, I used Allman[^] braces, following the style recommended by Microsoft and a couple of the books I was using. As time has gone on Allman has become my preferred style. I have some vision problems due to age and glaucoma, so my code needs frequent blank lines to separate logical blocks. Allman braces provide white space that isn't merely cosmetic. I've even got an editor macro that converts K&R braces to Allman. I have a large body of C++ that I recently converted as part of a refactor and refresh effort on an old product that I'm maintaining.
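For anyone who hasn't seen them side by side, a trivial illustration of the two brace styles (sketched in TypeScript here, though the placement question is the same in C# or C++):

// K&R: the opening brace shares a line with the control statement.
function sumKandR(values: number[]): number {
    let total = 0;
    for (const v of values) {
        total += v;
    }
    return total;
}

// Allman: every brace gets its own line, so { and } line up vertically.
function sumAllman(values: number[]): number
{
    let total = 0;
    for (const v of values)
    {
        total += v;
    }
    return total;
}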
Software Zen:
delete this;
-
Until just now I believed negative integers were just a flip of the first bit. Wow, how wrong I was, for MANY years! Are there some architectures where that is the case, to make myself feel a little better?
That async execution does not mean it's parallel/threaded, and it has varied behavior depending on the runtime.
To alcohol! The cause of, and solution to, all of life's problems - Homer Simpson ---- Our heads are round so our thoughts can change direction - Francis Picabia
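A minimal sketch of that point (TypeScript on Node or in a browser, where async/await runs on a single-threaded event loop; the names here are just illustrative):

// Two "concurrent" async calls interleave cooperatively on one thread;
// nothing runs in parallel, and a long synchronous loop in either call
// would block the other one completely.
async function work(label: string): Promise<void> {
    for (let i = 0; i < 3; i++) {
        console.log(`${label} step ${i}`);
        await Promise.resolve(); // yield to the event loop; still the same thread
    }
}

Promise.all([work("A"), work("B")]).then(() => console.log("both done"));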
-
I use Allman for C#, but for JavaScript (mostly TypeScript) I make use of K&R. I do this because it's the generally accepted style for each language and I am used to swapping between them. It's also because, working as part of a small team within a larger group (a team of 5 developers within a group of 20+ developers), it's easier to follow the generally accepted standards, or rather code doesn't get past code review if it doesn't follow those standards. At home I do the same, using Allman for C# and K&R for TypeScript or any other form of JavaScript - it just kind of 'feels' right.
“That which can be asserted without evidence, can be dismissed without evidence.”
― Christopher Hitchens
I try to do the same, but am working on a Java project that's using Allman because the lead dev's brain locks up trying to do any other style. He wanted to do C# capitalization rules too, but eventually yielded on that part because Android Studio's autocomplete is strictly case-sensitive, and having to try and remember different casing styles for our code vs Android code was blowing up everyone else's brains.
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, weighing all things in the balance of reason? Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful? --Zachris Topelius Training a telescope on one’s own belly button will only reveal lint. You like that? You go right on staring at it. I prefer looking at galaxies. -- Sarah Hoyt
-
Until just now I believed negative integers were just a flip of the first bit. Wow, how wrong I was, for MANY years! Are there some architectures where that is the case, to make myself feel a little better?
The longest programming misconception I've ever held is that JavaScript (ECMAScript) is "bad". It took me a long time to see JavaScript as just another "assembly language", but I've finally made "peace" with JavaScript. I think...
-
You don't need to fade it by saying it's personal preference when it's the Correct™️ way. Bring the jihad! :laugh: My rationale is that other coding styles often waste horizontal space but use vertical space miserly. The control statement before the { needs to stand out so that you don't have to squint to read its condition. It also aligns the { … } and reduces the number of broken lines, which is another thing I try to avoid (hence 3-space indentation instead of 4 or even 8, whose users should be forced to edit all their spaces manually).
Robust Services Core | Software Techniques for Lemmings | Articles
The fox knows many things, but the hedgehog knows one big thing.
-
Not an answer to the original question, however, since this part of the thread is dealing with style/formatting... Thankful that VS now supports the .editorconfig specification. I prefer and use 2-space indentation, so as to avoid scrolling left and right to be able to read my code.
I also put { and } on their own lines at almost all times; the primary exceptions would be public: inline accessor get methods where exposing the member variable itself would be a bad idea, i.e.
private: DWORD m_cbAllocated;
public: inline DWORD get_Allocated() const { return m_cbAllocated; }
In those cases, I find that breaking the method down into multiple lines is overkill.
I also like white space after ( and before ) as long as it is not an empty construct. It simply makes it easier for me to read, i.e.
if( ERROR_SUCCESS == ( lRet = RegOpenKeyExA( HKEY_LOCAL_MACHINE, rPF.sPath(), 0, KEY_READ, &hKey ) ) )
In comparisons to constants, I always like to have the constant on the left (see above). This serves two purposes: it is easier to see what I am comparing to without scrolling right past all the parameters, and it ensures that a typo (say a missing '=' sign) doesn't compile; with the constant on the right, the same typo would compile as an assignment to an lValue and result in a bug, i.e.
LONG lRet = RegOpenKeyExA( HKEY_LOCAL_MACHINE, rPF.sPath(), 0, KEY_READ, &hKey );
if( ERROR_SUCCESS = lRet ) { ... }
Not for everyone, but after doing this for 30+ years, it's what I'm used to, and no employer in their right mind is going to force me to change this late in the game. Here's the complimentary grain of salt.
-
Until just now I believed negative integers were just a flip of the first bit. Wow, how wrong I was, for MANY years! Are there some architectures where that is the case, to make myself feel a little better?
That foreach loops in PHP scope and clean up their memory references. Spoiler: they do not! I fixed quite a few long-outstanding bugs with unset() calls when I figured this out. I know it's in the documentation, but I bet quite a few of us don't RTFM on the looping structures.
-
In a trinary-sign computer, there would be a much bigger difference between logical and arithmetic operations. One way to do so would be to enforce that only non-negative values may be used in logical operations. A better solution IMO would be to ignore positive or negative signs, performing the logical operation only on the magnitudes. A zero sign would indicate that the magnitude must be "normalized" to zero before performing the operation. The result of the operation would either have a positive sign (if non-zero) or a zero sign (if zero). I leave the design of the hardware as an exercise to our hardware colleagues... :)
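A rough model of that idea, just to make the rule concrete (TypeScript; the type and function names are mine, not part of any real design):

// Signed-magnitude value with a three-state sign: -1, 0, or +1.
type Sign = -1 | 0 | 1;
interface TernarySigned { sign: Sign; magnitude: number; }

// Logical AND per the proposal: a zero sign forces the magnitude to zero,
// signs are otherwise ignored, and the result is positive or zero, never negative.
function logicalAnd(a: TernarySigned, b: TernarySigned): TernarySigned {
    const ma = a.sign === 0 ? 0 : a.magnitude; // "normalize" zero-signed operands
    const mb = b.sign === 0 ? 0 : b.magnitude;
    const m = ma & mb;
    return { sign: m === 0 ? 0 : 1, magnitude: m };
}

console.log(logicalAnd({ sign: -1, magnitude: 6 }, { sign: 1, magnitude: 3 }));
// { sign: 1, magnitude: 2 } — the negative sign was ignored, 6 & 3 === 2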
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows. -- 6079 Smith W.
-
Note: The following is a personal statement of preference, not an invitation to a jihad. Not really a programming misconception, but a coding style choice. For a very long time, starting in the mid-1980's through about 2010 or so, I used K&R braces exclusively. When I started writing C#, I used Allman[^] braces, following the style recommended by Microsoft and a couple of the books I was using. As time has gone on Allman has become my preferred style. I have some vision problems due to age and glaucoma, so my code needs frequent blank lines to separate logical blocks. Allman braces provide white space that isn't merely cosmetic. I've even got an editor macro that converts K&R braces to Allman. I have a large body of C++ that I recently converted as part of a refactor and refresh effort on an old product that I'm maintaining.
Software Zen:
delete this;
I always thought of the Allman style as the "readable" style, as opposed to the "space-saving publishing" style. Matching braces always let me see more quickly where blocks began and ended. I never knew it was called Allman, as I programmed Macs and other PCs.
-
Until just now I believed negative integers were just a flip of the first bit. Wow, how wrong I was, for MANY years! Are there some architectures where that is the case, to make myself feel a little better?
For 30 years, I have thought readability was the "truth". Then I bought a book called "Clean Code" by Robert C. Martin and found "clean code" was a convoluted set of rules producing hard-to-read code that made no sense. I am now aware that "readability" means "machine readability".
-
Until just now I believed negative integers were just a flip of the first bit. Wow, how wrong I was, for MANY years! Are there some architectures where that is the case, to make myself feel a little better?
That functional programming might be useful for anything. At first it seemed to me that, due to its recursive nature, it might be good for writing compilers. This was a misconception I held for nearly two minutes, which, as any GPU will tell you, is an eternity!
-
The Russian Setun computer (1958) was a base 3 computer, but I have no idea how it represented negative numbers.
-
Ger2001 wrote:
The Russian Setun computer (1958) was a base 3 computer
Ah, so it could represent the thesis, the anti-thesis, and the synthesis in a single digit. :D My proposal was for a CPU that has a signed-magnitude representation, but with a sign indicator that has three possible states - positive, zero, and negative. The magnitude might be in binary or any other convenient base. To my knowledge, this has never been tried, presumably because the hardware would be more complex than the currently popular twos-complement implementation. OTOH, my proposal would eliminate the anomalies of "signed zero" and of a negative range larger than the positive range. It would also be more consistent - unary minus would operate properly on all numbers in the range, which does not apply to the minimum value in a twos-complement implementation.
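The minimum-value anomaly is easy to demonstrate (TypeScript, using | 0 truncation to emulate a 32-bit two's-complement integer):

// In 32-bit two's complement the range is -2147483648 .. 2147483647,
// so negating the minimum value wraps back onto itself.
const int32Min = -2147483648;
const negated = (-int32Min) | 0;     // | 0 truncates to a signed 32-bit value

console.log(negated);                // -2147483648 — unary minus has no effect here
console.log(negated === int32Min);   // true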
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows. -- 6079 Smith W.
-
That functional programming might be useful for anything. At first it seemed to me that, due to its recursive nature, it might be good for writing compilers. This was a misconception I held for nearly two minutes, which, as any GPU will tell you, is an eternity!
Functional programming has fascinated me! But not enough to ever write a single line of code. Every time I come across one of those sites saying FP is going to solve all the problems that exist in programming I read it with curiosity. But I've never seen a single site that delves into the guts of what it would take to do a significant program, such as a word processor. Everything I've read indicates (between the lines) that when tackling such a problem FP would actually get in the way of accomplishing the goal. With their newer tree structure use it may be possible, but it still seems like a complete pain, and a memory-intensive hog. If anyone ever comes across a 'create a word processor with FP' site let me know!
-
Functional programming has fascinated me! But not enough to ever write a single line of code. Every time I come across one of those sites saying FP is going to solve all the problems that exist in programming I read it with curiosity. But I've never seen a single site that delves into the guts of what it would take to do a significant program, such as a word processor. Everything I've read indicates (between the lines) that when tackling such a problem FP would actually get in the way of accomplishing the goal. With their newer tree structure use it may be possible, but it still seems like a complete pain, and a memory-intensive hog. If anyone ever comes across a 'create a word processor with FP' site let me know!
Yeah. I actually tried it to see what it would be like... the answer is that it's effectively a thought experiment - nothing much more than that. With something like F#, you can also pretty much break the rules and start writing C# with different syntax, and in effect, that's what you end up doing: the functional thing ends up becoming nothing much more than a more cumbersome way to declare fairly standard methods. I don't think it could ever be used to write actual systems - such as your word processor, and every time I've tried to get someone who evangelizes this stuff to explain how it would be done, they just get angry. So I win!
-
Yeah. I actually tried it to see what it would be like... the answer is that it's effectively a thought experiment - nothing much more than that. With something like F#, you can also pretty much break the rules and start writing C# with different syntax, and in effect, that's what you end up doing: the functional thing ends up becoming nothing much more than a more cumbersome way to declare fairly standard methods. I don't think it could ever be used to write actual systems - such as your word processor, and every time I've tried to get someone who evangelizes this stuff to explain how it would be done, they just get angry. So I win!
Thanks for sharing your experience. For a while I was confused, because I heard that full systems were made in Lisp, which my reading made me think was functional. A few months ago I actually dug deeper into that and found that Lisp isn't an 'only-functional' language, and my 'A-Ha!' light turned on! So I'm still waiting for someone to show me my word processor example! If I'm not mistaken, another disadvantage of doing one fully functional is that there is no way to really organize and see the code as you can with OO. Functions everywhere! But maybe I don't understand it enough. It just seems like a pain in the ass.
-
You've got 1's complement as well, which I believe was far more common in the '60s and '70s than sign-magnitude. Wasn't the Univac 1100 series all 1's complement? Some CDC mainframes as well, I believe. I believe that you have to go back to designs from the '50s to find sign-magnitude integer representation. For floating point, I have never seen anything but sign-magnitude, though.
The Univac 1100 was indeed 1's complement. As was its mid-80s successor the 2200.
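For anyone who hasn't met it, 1's-complement negation is plain bit inversion, which is also where its infamous negative zero comes from (a sketch in TypeScript, looking only at the 32-bit patterns):

// ~x produces the bit pattern a 1's-complement machine would read as -x.
// (JavaScript itself is two's complement, so we print the raw hex patterns.)
const x = 5;
console.log((~x >>> 0).toString(16));  // "fffffffa" — -5 in 1's complement
console.log((~0 >>> 0).toString(16));  // "ffffffff" — the 1's-complement "negative zero"
console.log(~x + 1);                   // -5 — what a two's-complement machine does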