What is the longest programming misconception you've held (that you are aware of)?
-
Until just now I believed negative integers were just a flip of the first bit. Wow, how wrong I was, for MANY years! Are there some architectures where that is the case, to make myself feel a little better?
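To quantify the misconception: flipping the top bit is the sign-magnitude representation, while most modern CPUs use two's complement. A quick Python sketch of the difference at 8 bits (helper names are mine, purely illustrative):

```python
BITS = 8

def sign_magnitude(n):
    """'Flip the first bit' encoding: set the top bit on the magnitude."""
    return abs(n) | (1 << (BITS - 1)) if n < 0 else n

def twos_complement(n):
    """What most modern CPUs actually do: wrap modulo 2**BITS."""
    return n & ((1 << BITS) - 1)

print(f"{sign_magnitude(-1):08b}")   # 10000001
print(f"{twos_complement(-1):08b}")  # 11111111
```

And for some consolation: IEEE 754 floating point really is sign-magnitude, so negating a float is exactly a flip of its top bit.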
That foreach loops in PHP scope and clean up their memory references. Spoiler: they do not! I fixed quite a few long-standing bugs with unset() calls when I figured this out. I know it's in the documentation, but I bet quite a few of us don't RTFM on the looping structures.
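For anyone who hasn't hit it, the pitfall being described is a by-reference foreach leaving its loop variable aliased to the last element after the loop ends; a minimal sketch of the bug and the unset() fix:

```php
<?php
$arr = [1, 2, 3];

foreach ($arr as &$v) {   // $v is a reference into $arr
    $v *= 2;
}
unset($v);                // without this, $v still points at $arr[2],
                          // and the next foreach would silently overwrite it

foreach ($arr as $v) {}   // safe only because of the unset() above

var_dump($arr);           // [2, 4, 6] -- without the unset() it would be [2, 4, 4]
```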
-
In a trinary-sign computer, there would be a much bigger difference between logical and arithmetic operations. One way to do so would be to enforce that only non-negative values may be used in logical operations. A better solution IMO would be to ignore positive or negative signs, performing the logical operation only on the magnitudes. A zero sign would indicate that the magnitude must be "normalized" to zero before performing the operation. The result of the operation would either have a positive sign (if non-zero) or a zero sign (if zero). I leave the design of the hardware as an exercise to our hardware colleagues... :)
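A toy software model of the proposal, sketched in Python (all names are mine; this describes no real hardware):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Trisign:
    """Toy number with a three-state sign indicator: -1, 0, or +1."""
    sign: int       # -1, 0, or +1
    magnitude: int  # non-negative

    def normalized(self):
        # a zero sign forces the magnitude to zero before any logical op
        return Trisign(0, 0) if self.sign == 0 else self

def logical_and(a, b):
    a, b = a.normalized(), b.normalized()
    mag = a.magnitude & b.magnitude   # signs ignored, only magnitudes combine
    return Trisign(1 if mag else 0, mag)

x = Trisign(-1, 0b1100)
y = Trisign(+1, 0b1010)
print(logical_and(x, y))  # Trisign(sign=1, magnitude=8)
```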
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows. -- 6079 Smith W.
-
Note: the following is a personal statement of preference, not an invitation to a jihad. Not really a programming misconception, but a coding-style choice. For a very long time, from the mid-1980s through about 2010, I used K&R braces exclusively. When I started writing C#, I switched to Allman braces, following the style recommended by Microsoft and a couple of the books I was using. As time has gone on, Allman has become my preferred style. I have some vision problems due to age and glaucoma, so my code needs frequent blank lines to separate logical blocks. Allman braces provide white space that isn't merely cosmetic. I've even got an editor macro that converts K&R braces to Allman. I have a large body of C++ that I recently converted as part of a refactor-and-refresh effort on an old product I'm maintaining.
Software Zen:
delete this;
I always thought of the Allman style as the "readable" style, as opposed to the "space-saving publishing" style. Matching braces always let me see more quickly where blocks began and ended. I never knew it was called Allman, as I programmed Macs and other PCs.
-
For 30 years, I thought readability was the "truth". Then I bought a book called "Clean Code" by Robert C. Martin and found that "clean code" was a convoluted set of rules producing hard-to-read code that made no sense. I am now aware that "readability" means "machine readability".
-
That functional programming might be useful for anything. At first it seemed to me that, due to its recursive nature, it might be good for writing compilers. This was a misconception I held for nearly two minutes, which, as any GPU will tell you, is an eternity!
-
The Russian Setun computer (1958) was a base 3 computer, but I have no idea how it represented negative numbers.
Ger2001 wrote:
The Russian Setun computer (1958) was a base 3 computer
Ah, so it could represent the thesis, the antithesis, and the synthesis in a single digit. :D My proposal was for a CPU with a signed-magnitude representation, but with a sign indicator that has three possible states - positive, zero, and negative. The magnitude might be in binary or any other convenient base. To my knowledge this has never been tried, presumably because the hardware would be more complex than the currently popular two's-complement implementation. OTOH, my proposal would eliminate the anomalies of "signed zero" and of a negative range larger than the positive range. It would also be more consistent - unary minus would operate properly on all numbers in the range, which is not true of the minimum value in a two's-complement implementation.
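The unary-minus anomaly mentioned is easy to demonstrate by emulating 8-bit two's-complement negation in Python (a sketch, not any particular CPU):

```python
BITS = 8

def negate(n):
    """Two's-complement unary minus at 8 bits: invert, add one, re-sign."""
    r = (~n + 1) & ((1 << BITS) - 1)
    return r - (1 << BITS) if r >= (1 << (BITS - 1)) else r

print(negate(127))   # -127: fine
print(negate(-128))  # -128: -(-128) wraps back to itself -- the anomaly
```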
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows. -- 6079 Smith W.
-
That functional programming might be useful for anything. At first it seemed to me that, due to its recursive nature, it might be good for writing compilers. This was a misconception I held for nearly two minutes, which, as any GPU will tell you, is an eternity!
Functional programming has fascinated me! But not enough to ever write a single line of code. Every time I come across one of those sites saying FP is going to solve all the problems in programming, I read it with curiosity. But I've never seen a single site that digs into what it would actually take to build a significant program, such as a word processor. Everything I've read suggests (between the lines) that, for such a problem, FP would actually get in the way of accomplishing the goal. With the newer persistent tree structures it may be possible, but it still seems like a complete pain, and a memory-intensive hog. If anyone ever comes across a 'create a word processor with FP' site, let me know!
-
Functional programming has fascinated me! But not enough to ever write a single line of code. Every time I come across one of those sites saying FP is going to solve all the problems in programming, I read it with curiosity. But I've never seen a single site that digs into what it would actually take to build a significant program, such as a word processor. Everything I've read suggests (between the lines) that, for such a problem, FP would actually get in the way of accomplishing the goal. With the newer persistent tree structures it may be possible, but it still seems like a complete pain, and a memory-intensive hog. If anyone ever comes across a 'create a word processor with FP' site, let me know!
Yeah. I actually tried it to see what it would be like... the answer is that it's effectively a thought experiment - nothing much more than that. With something like F#, you can pretty much break the rules and start writing C# with different syntax, and in effect that's what you end up doing: the functional part becomes little more than a cumbersome way to declare fairly standard methods. I don't think it could ever be used to write actual systems - such as your word processor - and every time I've tried to get someone who evangelizes this stuff to explain how it would be done, they just get angry. So I win!
-
Yeah. I actually tried it to see what it would be like... the answer is that it's effectively a thought experiment - nothing much more than that. With something like F#, you can pretty much break the rules and start writing C# with different syntax, and in effect that's what you end up doing: the functional part becomes little more than a cumbersome way to declare fairly standard methods. I don't think it could ever be used to write actual systems - such as your word processor - and every time I've tried to get someone who evangelizes this stuff to explain how it would be done, they just get angry. So I win!
Thanks for sharing your experience. For a while I was confused, because I'd heard that full systems were written in Lisp, which my reading led me to believe was functional. A few months ago I dug deeper and found that Lisp isn't an 'only-functional' language, and my 'A-ha!' light turned on! So I'm still waiting for someone to show me my word processor example. If I'm not mistaken, another disadvantage of going fully functional is that there's no way to really organize and navigate the code as you can with OO - functions everywhere! But maybe I don't understand it well enough. It just seems like a pain in the ass.
-
You've got 1's complement as well, which I believe was far more common in the '60s and '70s than sign-magnitude. Wasn't the Univac 1100 series all 1's complement? Some CDC mainframes as well, I believe. I think you have to go back to designs from the '50s to find sign-magnitude integer representation. For floating point, though, I have never seen anything but sign-magnitude.
The Univac 1100 was indeed 1's complement. As was its mid-80s successor the 2200.
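For illustration: in one's complement, negation is simply bitwise NOT, which is also where its "negative zero" comes from. A quick Python sketch at 8 bits (function name is mine):

```python
def ones_complement(n, bits=8):
    """One's-complement encoding: unary minus is plain bitwise NOT."""
    mask = (1 << bits) - 1
    return n & mask if n >= 0 else ~(-n) & mask

print(f"{ones_complement(5):08b}")   # 00000101
print(f"{ones_complement(-5):08b}")  # 11111010
print(f"{~0 & 0xFF:08b}")            # 11111111: the "negative zero" pattern
```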
-
For 30 years, I have thought readability was the "truth". Then I bought a book called "Clean Code" by Robert C. Martin and found clean code was this convoluted set of rules of hard to read code that made no sense. I am now aware that "readability" means "machine readability".
For clarity, are you saying that the rules in 'Clean Code' are convoluted and make no sense, or that your earlier definition of 'clean code (readability?)' was nonsensical?
-
For clarity, are you saying that the rules in 'Clean Code' are convoluted and make no sense, or that your earlier definition of 'clean code (readability?)' was nonsensical?
I was implying that "Clean Code" rules were convoluted, but after reading it, I was clearly questioning my own rules as well. The confusion was deliberate!
-
Not strictly programming, but regarding the HTTPS protocol: for a long while I was convinced that sending sensitive data in the query string of an HTTPS request was insecure, because I believed the URL (including the query string) was sent unencrypted. At the time I was working with a financial-services company with many third parties sending data around in pseudo-webservices. I refused to deal with third parties that insisted on passing data in the query string, and insisted on POSTing it instead. It took a while (many months) before I realised that with HTTPS the client and host negotiate first, and the actual URL (with query string) is only sent once keys have been exchanged, so it is encrypted over the wire. It just seems counter-intuitive that a string I can type into the browser address bar, or into Fiddler, or set as the address of a WebRequest, actually gets encrypted before sending. Nobody ever called me out over it, though, and a few companies changed their interfaces as a result. :sigh: :-O
-
I had no idea C actually has (or had - I haven't kept up since ANSI C) a goto statement. (Not that it's in any way sane to use.)
I agree. For many years I assumed that C, C++ and C# had no GOTO - then I saw it in some example code somewhere and was shocked! I still didn't use it myself - I've never had the need for it - I was just surprised it existed.
- I would love to change the world, but they won’t give me the source code.
-
There are two ways to represent negative numbers: "sign & magnitude" and "two's complement". The "flip the top bit" approach is the former, and was used extensively by IBM until around the '70s, by which time it was clear that two's complement was a "better" solution (i.e. easier to implement in hardware, and without a "negative zero", which is an odd concept all on its own).
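The "negative zero" oddity can be shown with a quick Python sketch of an 8-bit sign-magnitude decoder (helper name is mine):

```python
def sm_decode(bits):
    """Decode an 8-bit sign-magnitude pattern: top bit is the sign."""
    sign = -1 if bits & 0b10000000 else 1
    return sign * (bits & 0b01111111)

# two distinct bit patterns both decode to zero:
print(sm_decode(0b00000000))  # 0  ("+0")
print(sm_decode(0b10000000))  # 0  ("-0": same value, different bits)
```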
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony "Common sense is so rare these days, it should be classified as a super power" - Random T-shirt AntiTwitter: @DalekDave is now a follower!