Should Devs know how maths works?
-
You got me beat. Late '60s, on a PDP-8/I: PDP-8/I display[^] But then I made up for it by being a computer operator for three Univac 418s: Univac 418's[^] That's me, mid '70s, with three computers to oversee; I had to be fast enough to be in two places at once :laugh: Later, when I wrote the Biorhythm cartridge for the Bally Home Arcade (later Astrocade), I had to write a multi-byte binary multiply and divide math package to do the date calculations. I wish I had known how to do that in high school on the PDP-8/I, where I ended up using the EAE (Extended Arithmetic Element) hardware for the date calculations. Turns out, every once in a while, a divide would take too long and the processor would then miss interrupts (really, really bad for the timesharing system it was running).
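(For anyone curious what a multi-byte multiply actually involves: here's a rough C# sketch of the same schoolbook idea - build a 64-bit product out of byte-sized partial products, which is roughly what a math package on an 8-bit machine has to do by hand. The method name is made up for the example; this is obviously not the original PDP-8 or Astrocade code.)

```csharp
using System;

class MultiByteMultiplyDemo
{
    // 32-bit x 32-bit -> 64-bit, using only byte-sized partial products.
    static ulong MulBytes(uint a, uint b)
    {
        ulong result = 0;
        for (int i = 0; i < 4; i++)
        {
            uint ai = (a >> (8 * i)) & 0xFF;                 // i-th byte of a
            for (int j = 0; j < 4; j++)
            {
                uint bj = (b >> (8 * j)) & 0xFF;             // j-th byte of b
                result += (ulong)(ai * bj) << (8 * (i + j)); // shift the partial product into place and accumulate
            }
        }
        return result;
    }

    static void Main()
    {
        // Sanity check against the hardware multiplier.
        Console.WriteLine(MulBytes(123456789, 987654321) == 123456789UL * 987654321UL); // True
    }
}
```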
Psychosis at 10. Film at 11.
I loved that ASCII art Einstein portrait. If you don't know how the math works, you will get it wrong. Consider floating point on the Univac 36-bit machines vs the Honeywell 36-bit machines. I was tasked with porting a computationally heavy application suite. As I recall, the Univac floating point registers were the same size and format as the memory values. The Honeywell registers were a different size, so you lost bits of precision when you stored a value. The result of a calculation in the registers did not compare equal to the same value after it had been stored to memory. Not only did I need to know how computer math worked, but also how it was implemented on two different machines.
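(The Univac and Honeywell iron is long gone, but the same class of surprise is easy to reproduce today: compute at one precision and store at a narrower one, and the stored value no longer compares equal. A minimal C# sketch, using double as the "register" and float as the "memory cell" purely as an analogy:)

```csharp
using System;

double computed = 0.1 + 0.2;            // full precision, as it sits "in the register"
float  stored   = (float)computed;      // narrowed on the "store to memory" - the low bits are discarded

Console.WriteLine(computed.ToString("G17")); // 0.30000000000000004
Console.WriteLine(computed == stored);       // False: what you read back is not what you computed
```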
-
No reason to get angry, and it's nice to hear that you are doing fine. And how could I overlook that people working on binary algorithmic calculating machines only need to know about those fundamentals when they are interested in them? Let's see if we can also find somebody who gets by perfectly without needing to know about the algorithmic part.
"I just exchanged opinions with my boss. I went in with mine and came out with his." - me, 2011 ---
I am endeavoring, Madam, to construct a mnemonic memory circuit using stone knives and bearskins - Mr. Spock 1935 and me 2011
CDP1802 wrote:
And how could I overlook that people working on binary algorithmic calculating machines only need to know about those fundamentals when they are interested in it?
No idea what your point is, since there is a great deal of programming that can be accomplished without understanding binary arithmetic (presumably that is what you are referring to), just as there is a great deal that can be accomplished without knowing how the hard drive works. Certainly in the past 10 years in my problem domains, a knowledge of threads and data structures is needed day to day and has a far greater impact than binary arithmetic. I have seldom used binary arithmetic, and I suspect that other idioms, although more complex (in code, not knowledge), could easily have been substituted. It could be that your problem domain requires more extensive usage, but I haven't seen anything to suggest that is true for a majority of problem domains - excluding, perhaps, the embedded domain.
-
Hi Guys, I am doing a bit of research and was just wondering... How many programmers know how a computer does math? We take it for granted that those beige boxes (or white, shiny ones in my case :o) know that 2 + 2 = 4, but how many devs know how they work it out? How many care? Should we know? If you know, how did you find out, and when / under what circumstances, etc.? I learned Boolean logic in the nineties while working with 68k assembler, and it was a real eye opener. What are the team's thoughts? Danny
Depends on the problem domain. For example, in the financial sector you need to understand exactly how floating point numbers work on a computer. But you also need to understand such things as what 'rounding' means in terms of the computer as well as in terms of business domains (which are not computer driven). But specific in-depth knowledge of integer arithmetic would not be needed. On the other hand, there are probably domains where it is essential, such as embedded controllers, which are likely to use bits to control functionality.
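(A quick, generic C# illustration of both points - binary floating point vs. what the business means by 'rounding'. Nothing here is tied to any particular financial system:)

```csharp
using System;

// Binary floating point cannot represent most decimal fractions exactly:
Console.WriteLine((0.1 + 0.2).ToString("G17"));   // 0.30000000000000004
Console.WriteLine(0.1 + 0.2 == 0.3);              // False

// And "rounding" itself has more than one business meaning:
Console.WriteLine(Math.Round(2.5m));                                   // 2 - banker's rounding, the .NET default
Console.WriteLine(Math.Round(2.5m, MidpointRounding.AwayFromZero));    // 3 - what most invoices expect

// Which is one reason money is usually kept in decimal (or integer cents), not double:
Console.WriteLine(0.1m + 0.2m == 0.3m);           // True
```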
-
n.podbielski wrote:
I think it's not really dev work
Math is not your work, it's knowledge that makes you better at your work. It's not required for drawing forms or manipulating Xml, but it helps a lot when you need to implement/understand an algorithm. Try writing your own
BigInt
in .NET 2, or Google for 'encryption' in VB6 - the latter will most likely give examples that perform a calculation on a string. It helps in understanding that a Guid is merely a large number, why there's a difference in text-encodings, and why the
or
is used in C# to "add" enums together (BindingFlags.Public | BindingFlags.Instance).
Bastard Programmer from Hell :suss:
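(To make that last point concrete, a minimal sketch: the Access enum below is invented for the example, but the | and & mechanics are exactly the same as with BindingFlags.)

```csharp
using System;

[Flags]
enum Access { None = 0, Read = 1, Write = 2, Execute = 4 }   // powers of two: one bit per option

class FlagsDemo
{
    static void Main()
    {
        Access a = Access.Read | Access.Write;       // OR merges the bits: 001 | 010 = 011
        bool canWrite = (a & Access.Write) != 0;     // AND masks one bit back out to test it

        Console.WriteLine(a);            // Read, Write
        Console.WriteLine(canWrite);     // True
    }
}
```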
Eddy Vluggen wrote:
Try writing your own
BigInt
in .NET 2, or Google for 'encryption' in VB6 - the latter will most likely give examples that perform a calculation on a string. It helps in understanding that a Guid is merely a large number, why there's a difference in text-encodings, and why the
or
is used in C# to "add" enums together (BindingFlags.Public | BindingFlags.Instance).
I don't understand why you think that I don't know that... and yes, I do know (or I have an idea of what you have in mind). What I meant was that you don't need to know binary, octal or hex math and number notation in day-to-day programming. Look at the tools that people use these days: high-level languages with garbage collectors, automatic memory management, a high level of abstraction away from hardware and platform, tools for generating code, ORMs so they don't need to know the DB, frameworks so they don't need to write code for common problems and algorithms. The industry doesn't need (many) people with that kind of skill. IMHO the industry needs coders, programmers, who write some class or method that validates user input for some custom value, while the other things are handled by a framework or some tool or library. Sorting algorithms? Hashing algorithms? Binary operations? Who uses those? The people who write frameworks, or the people who use them? How many Java programmers know x86 registers or assembly instructions? They just don't need that...
Eddy Vluggen wrote:
why the
or
is used in C# to "add" enums together (BindingFlags.Public | BindingFlags.Instance).
Not enums. Numbers that are powers of 2.
In Soviet Russia, code debugs You!
-
n.podbielski wrote:
I don't understand why you think that I don't know that... and yes, I do know (or I have an idea of what you have in mind).
Just examples to convey the general idea.
n.podbielski wrote:
What I meant was that you don't need to know binary, octal or hex math and number notation in day-to-day programming.
Depends on what you consider to be "day-to-day programming".
n.podbielski wrote:
How many Java programmers know x86 registers or assembly instructions? They just don't need that...
They're in a virtual machine; there's an abstraction layer there. How many .NET programmers can read IL? It's perhaps not required in day-to-day coding, but it is beneficial knowledge. The same goes for math, and understanding something is always better than relying on abstractions created by others.
n.podbielski wrote:
Not enums. Numbers that are powers of 2.
:)
Bastard Programmer from Hell :suss:
-
Hi Guys, I am doing a bit of research and was just wondering... How many programmers know how a computer does math? We take it for granted that those beige boxes (or white, shiny ones in my case :o) know that 2 + 2 = 4, but how many devs know how they work it out? How many care? Should we know? If you know, how did you find out, and when / under what circumstances, etc.? I learned Boolean logic in the nineties while working with 68k assembler, and it was a real eye opener. What are the team's thoughts? Danny
I have had numerous co-workers who are software developers, and a lot of them make the comment or ask the question: "Do you really need to know math to do this stuff?" Well, I cannot answer that, because they are my co-workers and I need to get along with them, since personality is part of the performance rating. If I were to answer that question honestly (albeit with a bit of viciousness), I would say, "It is not required if you (they) are to produce anything which impresses me, because up to this point, you (they) apparently have not. I am guessing that a lack of mathematical prowess has some influence on this situation, which would be a dilemma to me but apparently is unimportant to you." Back when I was young and stupid, and openly arrogant and cruel, I believed I could get by by smoking the competition. The good thing is I learned over time that this was bad behavior, driven by some personal issue unrelated to the recipient of the attack. So now, I'm just old and stupid. Big improvement...
-
I might fit into the above category. I code and I take it very seriously. I did not start my working life as a coder, in fact far from it. I have no formal education as a coder, but I have taught myself enough to get where I am. I constantly strive to improve my code and expand what I know. On those very rare occasions when I have free time, I read as many tech articles and/or books on coding and theory as I can. But my math skills are still pretty poor. I know this, and I accept that I will have to do crunch-time research every time a hex issue or binary issue pops up. No, it's not the best approach, but it can work. I have been at my job for six years now and am writing some pretty important software for my client. FWIW, I am also thankful that there are folks smarter than I am who are willing to share what they know about these topics. Please remember that not everyone who doesn't do well at math is a script-kiddy slacker parasite just waiting to have you do their work for them. I realize that is NOT what you said, but I have to admit to frequently getting that feeling from many of the posts here on different topics. Maybe you math folks are just smarter than us none-too-good at math folks? I am willing to concede that point. But I would wager that many of us DO know that a clear understanding of the basics of computer math is important and that we DO try. We don't always succeed, but we try. Kurt
So you really think I'm some arrogant bastard? That's not true, at least I hope so. And with the abundance of things which may be good to know, I also have nothing against the practical approach of tackling problems as they come. But please don't ask me a question, call me an old-fashioned fool when you don't like my answer, and come running to me again when a similar situation arises.

Already 33 years ago, when I typed my first code, some people were singing the song of 'nowadays we have this and that, and in the future we are also going to have something else, and that's why we will not ever have to care about xxx anymore'. Now I will not grin too much about how this and that have long since disappeared, or how something else never quite happened. What really matters is that xxx, which we supposedly never had to care about again, is still there.

Boolean algebra is one of the things that are present at all levels: down at the hardware, in the CPU's machine code (even in the microcode used to implement the machine instructions), in every high level language there has ever been, and also in the frameworks. Why is that so? Why are languages with limited support for boolean algebra (like early BASIC interpreters) seen as restrictive? Why has this not been covered up with a few layers of framework, so that we can successfully pretend it does not exist? Believe me, it is a very fundamental thing. You use it every time you do an integer operation. You use it in every condition where you use some 'AND' or 'OR' operator.

From a professional I would expect not having to rely on guessing for those things. I simply don't believe that anybody can get very far without having to invest far more effort in fumbling around it than in simply learning it. Not that it's really so much to learn anyway.
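(To make the 'AND'/'OR' point concrete, a tiny C# example showing De Morgan's law at work in an ordinary condition; the variable names are made up, and the loop just checks the equivalence exhaustively.)

```csharp
using System;

class DeMorganDemo
{
    static void Main()
    {
        // "not (logged in AND admin)" is the same condition as "not logged in OR not admin".
        // Knowing why lets you pick the readable form - and spot the broken one in a review.
        foreach (bool loggedIn in new[] { false, true })
            foreach (bool isAdmin in new[] { false, true })
                Console.WriteLine(!(loggedIn && isAdmin) == (!loggedIn || !isAdmin));   // True, all four times
    }
}
```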
"I just exchanged opinions with my boss. I went in with mine and came out with his." - me, 2011 ---
I am endeavoring, Madam, to construct a mnemonic memory circuit using stone knives and bearskins - Mr. Spock 1935 and me 2011
-
You may not need that low-level stuff in 99% of your working life, but it is the most fundamental thing that our business is based on. So, just to be a good professional, you need to know it and understand it well (besides, it's not that complex). It is just a matter of being capable and well educated about your profession, and of reducing the (high) level of ignorance in our society today.
I agree, except that the low level stuff is always there and you do use it every time you start a line with 'if'. How is someone to write a more complex condition without understanding boolean algebra? Living happily without it is an illusion.
"I just exchanged opinions with my boss. I went in with mine and came out with his." - me, 2011 ---
I am endeavoring, Madam, to construct a mnemonic memory circuit using stone knives and bearskins - Mr. Spock 1935 and me 2011
-
CDP1802 wrote:
And how could I overlook that people working on binary algorithmic calculating machines only need to know about those fundamentals when they are interested in it?
No idea what your point is, since there is a great deal of programming that can be accomplished without understanding binary arithmetic (presumably that is what you are referring to), just as there is a great deal that can be accomplished without knowing how the hard drive works. Certainly in the past 10 years in my problem domains, a knowledge of threads and data structures is needed day to day and has a far greater impact than binary arithmetic. I have seldom used binary arithmetic, and I suspect that other idioms, although more complex (in code, not knowledge), could easily have been substituted. It could be that your problem domain requires more extensive usage, but I haven't seen anything to suggest that is true for a majority of problem domains - excluding, perhaps, the embedded domain.
The point is that boolean algebra is the very fundament on which the entire machine has been built and does not go away because you want it to. On hardware level algorithms are implemented with logical gates. The microprocessor is an implementation of an algorithm that allows you to formulate algorithms in software. It is also built on boolean gates and its machine code again is built on boolean algebra. It does not matter how many layers of frameworks you have and how abstract your programming language may be, boolean algebra is always there. It's not even hidden very well. You use it whenever you perform some operation on an integer value or in every condition of an 'if' statement. Those things translate almost 1:1 into machine instructions and operation on some hardware registers. Isn't it strange that no language or framework was successfully able to abstract that away? Now, please tell me, what do you think of a developer who can't predict what happens after an operation on some data type or who is not so sure about how to formulate any non-trivial condition.
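(A few lines of C# to illustrate that 'almost 1:1' claim - each of these operations on an int is a boolean operation on its bits underneath, and each compiles down to roughly one machine instruction. The values are arbitrary.)

```csharp
using System;

int n = 0b0010_1101;                 // 45

Console.WriteLine(n & 1);            // 1   -> lowest bit is set, so n is odd
Console.WriteLine(n >> 3);           // 5   -> shifting right by 3 divides by 8
Console.WriteLine(n & 0x0F);         // 13  -> masking keeps only the low nibble
Console.WriteLine(n | 0x80);         // 173 -> OR-ing sets bit 7
```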
"I just exchanged opinions with my boss. I went in with mine and came out with his." - me, 2011 ---
I am endeavoring, Madam, to construct a mnemonic memory circuit using stone knives and bearskins - Mr. Spock 1935 and me 2011
-
I'm interested, but seeing as it wouldn't really benefit my current job (which tends to be dealing with users asking questions, deploying things and going to meetings 95% of the time, with very little coding or design - plus I keep hearing murmurings that mean I might soon be forced to work with SharePoint), I never end up going into the low-level side that much. Shame really, as it's the inner workings that interest me more. In these days of 8-core CPUs where the busiest core is typically never more than 2% busy under normal use, I'd guess there aren't many situations that call for that kind of thing anymore, though.
When you are the general, it's easy to say that you don't have to know how to shoot as long as your soldiers do :) Seriously, if you look behind the compilers and frameworks, you will see many things which are wasteful and inefficient. Those things have become common practice because we have machines with powerful processors and huge amounts of memory. The problem is that many developers quickly become helpless when they have to make do with slightly less of everything.
"I just exchanged opinions with my boss. I went in with mine and came out with his." - me, 2011 ---
I am endeavoring, Madam, to construct a mnemonic memory circuit using stone knives and bearskins - Mr. Spock 1935 and me 2011
-
Hi Guys, I am doing a bit of research and was just wondering... How many programmers know how a computer does math? We take it for granted that those beige boxes (or white, shiny ones in my case :o) know that 2 + 2 = 4, but how many devs know how they work it out? How many care? Should we know? If you know, how did you find out, and when / under what circumstances, etc.? I learned Boolean logic in the nineties while working with 68k assembler, and it was a real eye opener. What are the team's thoughts? Danny
I believe around '78-'79. I started programming late in '74. It did get into precision, and my matrix math class got into some non-"well-formed" computer math problems. (That was the first time I could get near a computer; the community college I started at wouldn't let students near theirs, it was too valuable.) Hired by Boeing in '76. COBOL coding with IBM. First thing they did was send me to a JCL class. Didn't learn anything I didn't already know. Transferred to a new group in '78. They sent me to class because I would be working with CDC machines and FORTRAN. Learned to read dumps using octal output and how code used memory and registers to actually do the math, plus minor differences in FORTRAN. Not too much about the architecture of the CDC, just that it used 60-bit words, so it worked well with octal. Around '80 I was about to become a tech rep. That was a different teaching experience: I had three, sometimes four, teachers for two weeks. Each teacher would go one-on-one for two hours straight, and then either a fourth teacher took a final two hours or two of the teachers took another hour each. I learned about 1's/2's complement, IBM's structure, more detail about the physical structure of the CDC and how data moved (ditto for the CRAY and IBM), and got in-depth instruction on the precision and workings of 60-, 64-, and 32-bit machines, plus finite element analysis concepts and three of its languages. At that time, defining different integer types wasn't done much. They made it very clear that both the CDC and the CRAY had superior precision over the joke of the engineering world - IBM. You defined an integer or a real, and it used the structure of the machine to define it. I think I learned more in those two weeks than in the two years of what I thought was a brutal workload at WSU. Took some classes on my own in 2004. The binary instruction I got was a joke (it never covered the sign bit), and the rest of the students couldn't care less about the little info that was given.
-
Iain Clarke, Warrior Programmer wrote:
Whippersnapper!
Mewling infant! Mid '60s on this machine[^]; I'm not in any of the photos, but the dark-haired guy in the first picture was my shift leader.
The best things in life are not things.
It's reassuring to know that they let you use computers in the retirement home :)
-
Hi Guys, I am doing a bit of research and was just wondering... How many programmers know how a computer does math? We take it for granted that those beige boxes (or white, shiny ones in my case :o) know that 2 + 2 = 4, but how many devs know how they work it out? How many care? Should we know? If you know, how did you find out, and when / under what circumstances, etc.? I learned Boolean logic in the nineties while working with 68k assembler, and it was a real eye opener. What are the team's thoughts? Danny
Just taking things a bit further than boolean algebra: Big-O notation calls for some basic knowledge about log, pow, etc. Just being able to say "Oh, this algorithm is more efficient than that one" (even if you can't quantify by how much) is pretty important. Although I never use it at my job (which casts one vote), I wonder how useful calculus is in the industry. Game dev (which I dabble in) is 99.9999% calculus, so it obviously applies there - but has anyone seen advanced calculus feature in LOB applications? Knowing (not just understanding) statistics and set theory can help you improve your code and save you a large amount of IO overhead (and possibly CPU cycles). Clearly some boolean algebra would help: simplifying boolean expressions (e.g. with De Morgan's laws) makes code easier to understand and can shave off CPU cycles if your compiler isn't smart enough to do it itself. Considering that computer science started off as experiments in mathematics (and is in most/some universities classified under maths), it's no surprise that having more mathematical knowledge can make you a better programmer.
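(A back-of-the-envelope C# example of the Big-O point - nothing beyond Math.Log, and the numbers are only there to give a feel for the gap between O(n) and O(log n).)

```csharp
using System;

// Worst-case comparisons to find one item among a million:
int n = 1_000_000;
int linear = n;                                      // O(n): look at every element
int binary = (int)Math.Ceiling(Math.Log(n, 2));      // O(log n): halve the range each step

Console.WriteLine($"linear search: {linear} comparisons");   // 1000000
Console.WriteLine($"binary search: {binary} comparisons");   // 20
```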
He who asks a question is a fool for five minutes. He who does not ask a question remains a fool forever. [Chinese Proverb] Jonathan C Dickinson (C# Software Engineer)
-
>>So you really think I'm some arrogant bastard?
No, and I'm honestly sorry if that's what came across.
>>But please don't ask me a question, call me an old-fashioned fool when you don't like my answer, and come running to me again when a similar situation arises.
I actually agree 100% with that. What I had hoped to convey was that, for the past few years that I have lurked on forums like CP and others, I have seen a fairly consistent trend of advanced coders who insist that if you do not know theory X or understand equation Y, then you are not a serious programmer and cannot consider yourself one. I completely agree that an incurious mind has no place in coding, and to that end I am constantly learning all I can. I don't think you are arrogant, but I do think that many advanced coders are so steeped in what they know, so good at what they do, that they forget not everyone is the same as them. They forget that not everyone has had the same experiences and education. For instance, I have absolutely no formal education in coding. Everything I know was learned through books, forums and trial-and-error. I do not think, by any stretch, that this is the best way to learn, but it is what it is. I have worked very, very hard at what I do, and I honestly wish I were younger so that I could have even more time to learn this craft.
>>From a professional I would expect not having to rely on guessing for those things.
Agreed. As I said, when those problems arise I do the research. Sometimes I have to do the research more than once for a similar problem, because my mind is so full of other things that I don't recall the exact answer from the last time. So I encounter an issue and don't know the answer. I research to find that answer. But my bosses want to know why X is taking longer than anticipated, so with that in mind I implement what I consider to be a solution. Is it the best solution? Maybe not. But there are also practical limits on the time I can spend saturating myself in any given topic. I often have to revisit a topic later, when time permits, to really dig into the answer.
>>I simply don't believe that anybody can get very far without having to invest far more effort in fumbling around it than in simply learning it. Not that it's really so much to learn anyway.
That's exactly what I am addressing above. Maybe it's more a matter of poor time management in my case, but there are limits on how long I can delve into any topic before I need to move on. I would just ask that folks keep that in mind.
-
CDP1802 wrote:
The point is that boolean algebra is the very fundament on which the entire machine has been built and does not go away because you want it to.
The hard drive and monitor are fundamental as well. And I said nothing about wanting it to go away.
CDP1802 wrote:
On hardware level algorithms are implemented with logical gates. The microprocessor is an implementation of an algorithm that allows you to formulate algorithms in software. It is also built on boolean gates and its machine code again is built on boolean algebra. It does not matter how many layers of frameworks you have and how abstract your programming language may be, boolean algebra is always there. It's not even hidden very well.
Sigh... I didn't say I don't know how it works. I said that in day-to-day usage, binary arithmetic is seldom used by me. And I seriously doubt that the software development that occurs on desktop systems, and even mobile devices, requires knowledge of any of that. And given that I was trained as an electrical engineer, I feel that I am competent to judge what impact my knowledge of how gates work has on my day-to-day activities as a software engineer. (Zero, by the way, if it wasn't clear.)
CDP1802 wrote:
You use it whenever you perform some operation on an integer value or in every condition of an 'if' statement. Those things translate almost 1:1 into machine instructions and operation on some hardware registers. Isn't it strange that no language or framework was successfully able to abstract that away?
That of course is absolutely irrelevant. No more so than how many gate transactions occur in the CPU to achieve that same operation.
CDP1802 wrote:
Now, please tell me, what do you think of a developer who can't predict what happens after an operation on some data type or who is not so sure about how to formulate any non-trivial condition.
I would say that that statement does not in any way follow from what we are discussing. There is a huge, vast leap in logic which is totally missing there.
-
punch card chad on my oatmeal :laugh:
Steve _________________ I C(++) therefore I am
High in fiber!
-
Hi Guys, I am doing a bit of research and was just wondering... How many programmers know how a computer does math? We take it for granted that those beige boxes (or white, shiny ones in my case :o) know that 2 + 2 = 4, but how many devs know how they work it out? How many care? Should we know? If you know, how did you find out, and when / under what circumstances, etc.? I learned Boolean logic in the nineties while working with 68k assembler, and it was a real eye opener. What are the team's thoughts? Danny
I'm 17. I first drew lines with a for loop on a beautiful ZX Spectrum +2 128k six years ago. I recently obtained a BBC Master and dot matrix printer which had been lying around in the school biology lab since '88, complete with green CRT monitor and ribbon print cartridges. I won't lie - my friend and I have four ZX Spectrums between us. I was lucky, dead lucky: I learnt to program in C for the Nintendo DS, with 4 MB of RAM. I still carefully select between int sizes in .NET, and waste nary a bit or a cycle. The n00bs I see learning to program today have to jump in right at the top (you get me) and write the biggest mess, which still works fine. In fact, it works just as well as tight, efficient code, and they know no better. People should know the math.