Should Devs know how maths works?
-
Not everyone is like you. What you think should be important may not be important to me or the next guy. I do very well for myself in my profession (on all levels). To slight me because I don't get off on 0's and 1's is lame. Instead of talking smack about your intern and crying about it, why don't you take the time to show this person the connection between the 1's and 0's and why they are important.
-- ** You don't hire a handyman to build a house, you hire a carpenter. ** Jack of all trades and master of none.
No reason to get angry, and it's nice to hear that you are doing fine. And how could I overlook that people working on binary algorithmic calculating machines only need to know those fundamentals when they are interested in them? Let's see if we can also find somebody who gets by perfectly well without needing to know about the algorithmic part.
"I just exchanged opinions with my boss. I went in with mine and came out with his." - me, 2011 ---
I am endeavoring, Madam, to construct a mnemonic memory circuit using stone knives and bearskins - Mr. Spock 1935 and me 2011
-
No reason to get angry, and it's nice to hear that you are doing fine. And how could I overlook that people working on binary algorithmic calculating machines only need to know those fundamentals when they are interested in them? Let's see if we can also find somebody who gets by perfectly well without needing to know about the algorithmic part.
"I just exchanged opinions with my boss. I went in with mine and came out with his." - me, 2011 ---
I am endeavoring, Madam, to construct a mnemonic memory circuit using stone knives and bearskins - Mr. Spock 1935 and me 2011
Are we debating algorithms or binary math? Are we debating logic, or 0's and 1's? I can't tell any more. ...I'm not angry. However, you sound as though we all "need" to know this stuff because "you" know it, and I am just saying that is plain silly.
-- ** You don't hire a handyman to build a house, you hire a carpenter. ** Jack of all trades and master of none.
-
Danny Martin wrote:
nineties while working with 68k assembler
Whippersnapper! Early '80s, Z80 & 6502. Ah, the days of knowing 1's and 2's complement, and hexadecimal... Iain.
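For anyone who missed those days, both complements take only a couple of lines to demonstrate. A toy Python sketch, assuming an 8-bit word purely for illustration:

```python
def ones_complement(x, bits=8):
    """One's complement: flip every bit within the given width."""
    return ~x & ((1 << bits) - 1)

def twos_complement(x, bits=8):
    """Two's complement: one's complement plus one -- how the hardware negates."""
    return (ones_complement(x, bits) + 1) & ((1 << bits) - 1)

# -5 as an 8-bit two's complement pattern:
print(format(twos_complement(5), '08b'))  # 11111011
print(hex(twos_complement(5)))            # 0xfb
# Adding 5 back wraps around to zero, as it should:
print((twos_complement(5) + 5) & 0xFF)    # 0
```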
I am one of "those foreigners coming over here and stealing our jobs". Yay me!
I love the smell of a hex dump in the morning.
"Life can only be understood backwards, but it must be lived forward." Kierkegaard, Søren
-
Hi guys, I am doing a bit of research and was just wondering... how many programmers know how a computer does math? We take it for granted that those beige boxes (or white, shiny ones in my case :o) know that 2 + 2 = 4, but how many devs know how they work it out? How many care? Should we know? If you know, how did you find out, and when / under what circumstances, etc.? I learned Boolean logic in the nineties while working with 68k assembler, and it was a real eye-opener. What are the team's thoughts? Danny
I personally see both sides of the fence here. Ultimately, it depends on what you want to know. As the world of computer science grows larger and larger, it is important for people to specialize. I personally don't know how the NT file system writes out chunks of a file, or how the information I'm seeing on this screen is transmitted to the monitor, but that doesn't hinder my ability to develop an application (at least until I run across a problem requiring such knowledge). The great thing about computer science is this: we can stand on the shoulders of giants to achieve something even greater. I don't have to know everything about everything; I just have to know what is relevant to the task at hand. That doesn't stop my own curiosity from taking me into uncharted waters, though. In the end, the few courses that touched on this in college were among my favorites. I find it interesting, but I in no way insist that we should all know it.
modified on Wednesday, June 1, 2011 10:20 AM
-
No reason to get angry, and it's nice to hear that you are doing fine. And how could I overlook that people working on binary algorithmic calculating machines only need to know those fundamentals when they are interested in them? Let's see if we can also find somebody who gets by perfectly well without needing to know about the algorithmic part.
"I just exchanged opinions with my boss. I went in with mine and came out with his." - me, 2011 ---
I am endeavoring, Madam, to construct a mnemonic memory circuit using stone knives and bearskins - Mr. Spock 1935 and me 2011
What I'm saying here, and is being echoed by others, is that you don't need to know that a computer uses AND / XOR to do addition in order to write a function which adds two numbers. I believe there are many devs out there that don't know, don't need to know and quite frankly couldn't care less. You don't need to know how to write a for-next loop in assembler in order to write one in another, higher-level language, but if you 'do' know how it works at the machine level I think it gives you a greater insight into how those higher-level languages do their thing. It certainly helped me to understand what was going on under the hood. Danny
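The AND / XOR trick is small enough to show. A toy sketch in Python (no real ALU is written this way, but the carry logic is the same idea):

```python
def add(a, b):
    """Add two non-negative integers using only AND, XOR and shifts.

    XOR produces the sum bits without carries; AND, shifted left one
    place, produces the carries. Repeating until no carries remain is
    the software analogue of a chain of hardware full adders.
    """
    while b:
        carry = (a & b) << 1  # bit positions where both inputs are 1 carry out
        a = a ^ b             # partial sum, ignoring the carries
        b = carry             # feed the carries back in
    return a

print(add(2, 2))    # 4
print(add(13, 29))  # 42
```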
-
What I'm saying here, and is being echoed by others, is that you don't need to know that a computer uses AND / XOR to do addition in order to write a function which adds two numbers. I believe there are many devs out there that don't know, don't need to know and quite frankly couldn't care less. You don't need to know how to write a for-next loop in assembler in order to write one in another, higher-level language, but if you 'do' know how it works at the machine level I think it gives you a greater insight into how those higher-level languages do their thing. It certainly helped me to understand what was going on under the hood. Danny
Danny Martin wrote:
You don't need to know how to write a for-next loop in assembler in order to write one in another, higher-level language, but if you 'do' know how it works at the machine level I think it gives you a greater insight into how those higher-level languages do their thing. It certainly helped me to understand what was going on under the hood.
Well said. :thumbsup:
-- ** You don't hire a handyman to build a house, you hire a carpenter. ** Jack of all trades and master of none.
-
What I'm saying here, and is being echoed by others, is that you don't need to know that a computer uses AND / XOR to do addition in order to write a function which adds two numbers. I believe there are many devs out there that don't know, don't need to know and quite frankly couldn't care less. You don't need to know how to write a for-next loop in assembler in order to write one in another, higher-level language, but if you 'do' know how it works at the machine level I think it gives you a greater insight into how those higher-level languages do their thing. It certainly helped me to understand what was going on under the hood. Danny
Certainly. The closer you get to the hardware, the more you must know. If you happen to work on, let's say, a driver for some piece of hardware, then it becomes absolutely essential. Then think about what you do every day with integer types: type casts, conversions, signed vs. unsigned or, when importing binary data, little endian vs. big endian. How is someone supposed to deal with all the little pitfalls involved in those things without knowing anything about binary arithmetic? Nice frameworks and programming languages cannot protect you from everything.
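A small sketch of the pitfalls described above, using Python's standard struct module to reinterpret the same bytes (the byte values are made up for illustration):

```python
import struct

data = b'\x00\x00\x00\x01'  # four bytes read from some binary file

# The same bytes decode to very different numbers depending on byte order:
print(struct.unpack('>I', data)[0])  # big-endian unsigned: 1
print(struct.unpack('<I', data)[0])  # little-endian unsigned: 16777216

# And signed vs. unsigned disagree once the top bit is set:
data = b'\xff\xff\xff\xff'
print(struct.unpack('>I', data)[0])  # unsigned 32-bit: 4294967295
print(struct.unpack('>i', data)[0])  # signed 32-bit:   -1
```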
"I just exchanged opinions with my boss. I went in with mine and came out with his." - me, 2011 ---
I am endeavoring, Madam, to construct a mnemonic memory circuit using stone knives and bearskins - Mr. Spock 1935 and me 2011
-
Are we debating algorithms or binary math? Are we debating logic, or 0's and 1's? I can't tell any more. ...I'm not angry. However, you sound as though we all "need" to know this stuff because "you" know it, and I am just saying that is plain silly.
-- ** You don't hire a handyman to build a house, you hire a carpenter. ** Jack of all trades and master of none.
It doesn't matter what I do or don't know. I'm just saying that those things are so fundamental that you must deal with them, like it or not. The only alternative is to become a cargo cult programmer who strictly follows rules from people who (hopefully) know what they are talking about. I prefer not to end up that way.
"I just exchanged opinions with my boss. I went in with mine and came out with his." - me, 2011 ---
I am endeavoring, Madam, to construct a mnemonic memory circuit using stone knives and bearskins - Mr. Spock 1935 and me 2011
-
Certainly. The closer you get to the hardware, the more you must know. If you happen to work on, let's say, a driver for some piece of hardware, then it becomes absolutely essential. Then think about what you do every day with integer types: type casts, conversions, signed vs. unsigned or, when importing binary data, little endian vs. big endian. How is someone supposed to deal with all the little pitfalls involved in those things without knowing anything about binary arithmetic? Nice frameworks and programming languages cannot protect you from everything.
"I just exchanged opinions with my boss. I went in with mine and came out with his." - me, 2011 ---
I am endeavoring, Madam, to construct a mnemonic memory circuit using stone knives and bearskins - Mr. Spock 1935 and me 2011
CDP1802 wrote:
The closer you get to the hardware, the more you must know. If you happen to work on, let's say, a driver for some piece of hardware, then it becomes absolutely essential.
I agree, but I was just talking generally...
CDP1802 wrote:
Nice frameworks and programming languages cannot protect you from everything.
There, however, I disagree. You can, for example, be a very successful web developer and not care about the difference between an integer and a string in some instances (take PHP...). I feel that knowing the fundamental stuff is advantageous, but it's by no means essential for working in a higher-level language that does the majority of the work for you. Many replies here seem to bear that point out pretty well. Danny
-
CDP1802 wrote:
The closer you get to the hardware, the more you must know. If you happen to work on, let's say, a driver for some piece of hardware, then it becomes absolutely essential.
I agree, but I was just talking generally...
CDP1802 wrote:
Nice frameworks and programming languages cannot protect you from everything.
There, however, I disagree. You can, for example, be a very successful web developer and not care about the difference between an integer and a string in some instances (take PHP...). I feel that knowing the fundamental stuff is advantageous, but it's by no means essential for working in a higher-level language that does the majority of the work for you. Many replies here seem to bear that point out pretty well. Danny
In my experience, that also accounts for a few new posts in the Coding Horrors section, usually committed by somebody who is totally unaware of what's so horrible about it. And, my personal favorite: experienced developers quickly looking the other way and leaving the dirty work to that rambling idiot who keeps ranting about how important that stuff is.
"I just exchanged opinions with my boss. I went in with mine and came out with his." - me, 2011 ---
I am endeavoring, Madam, to construct a mnemonic memory circuit using stone knives and bearskins - Mr. Spock 1935 and me 2011
-
Hi guys, I am doing a bit of research and was just wondering... how many programmers know how a computer does math? We take it for granted that those beige boxes (or white, shiny ones in my case :o) know that 2 + 2 = 4, but how many devs know how they work it out? How many care? Should we know? If you know, how did you find out, and when / under what circumstances, etc.? I learned Boolean logic in the nineties while working with 68k assembler, and it was a real eye-opener. What are the team's thoughts? Danny
No. I knew once, back in the day when I cut my teeth doing assembly programming, and a bit before that when I did a little electronics as a hobby and played with logic chips. These days, as a developer using any modern language, it's *utterly* irrelevant; if someone wants to take it up as a hobby, good on them, but that's it.
There is no failure, only feedback
-
I love the smell of a hex dump in the morning.
"Life can only be understood backwards, but it must be lived forward." Kierkegaard, Søren
punch card chad on my oatmeal :laugh:
Steve _________________ I C(++) therefore I am
-
They should also know how a computer works, and how the VM or runtime they are targeting works.
cheers, Chris Maunder The Code Project | Co-founder Microsoft C++ MVP
Chris Maunder wrote:
They should also know how a computer works, and how the VM or runtime they are targeting works.
...and the quality, quantity and specific type of mud to use in the pies.
Michael Martin Australia "I controlled my laughter and simple said "No,I am very busy,so I can't write any code for you". The moment they heard this all the smiling face turned into a sad looking face and one of them farted. So I had to leave the place as soon as possible." - Mr.Prakash One Fine Saturday. 24/04/2004
-
Chris Maunder wrote:
They should also know how a computer works, and how the VM or runtime they are targeting works.
...and the quality, quantity and specific type of mud to use in the pies.
Michael Martin Australia "I controlled my laughter and simple said "No,I am very busy,so I can't write any code for you". The moment they heard this all the smiling face turned into a sad looking face and one of them farted. So I had to leave the place as soon as possible." - Mr.Prakash One Fine Saturday. 24/04/2004
Well, obviously.
cheers, Chris Maunder The Code Project | Co-founder Microsoft C++ MVP
-
Well, obviously.
cheers, Chris Maunder The Code Project | Co-founder Microsoft C++ MVP
Might end up getting a job down here in Canberra with the customer I'm currently working on site with. If it looks more certain (and it is looking more than promising at the moment) I will need to have a chat with you about the areas (if any) to avoid living in, and any other Canberra advice you may have.
Michael Martin Australia "I controlled my laughter and simple said "No,I am very busy,so I can't write any code for you". The moment they heard this all the smiling face turned into a sad looking face and one of them farted. So I had to leave the place as soon as possible." - Mr.Prakash One Fine Saturday. 24/04/2004
-
Hi guys, I am doing a bit of research and was just wondering... how many programmers know how a computer does math? We take it for granted that those beige boxes (or white, shiny ones in my case :o) know that 2 + 2 = 4, but how many devs know how they work it out? How many care? Should we know? If you know, how did you find out, and when / under what circumstances, etc.? I learned Boolean logic in the nineties while working with 68k assembler, and it was a real eye-opener. What are the team's thoughts? Danny
Danny, I see two circumstances when programmers need to be aware of the inner workings of computer arithmetic:
- when they face the limitations of the finite representation (overflow, and inaccuracies resulting from truncation),
- when they need to use optimization "tricks" that rely on the specifics of the representation (such as replacing a division by a power of 2 with a shift).
Besides that, being cultured never harms, does it? For the brave ones: http://www.validlab.com/goldberg/paper.pdf
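Both circumstances can be shown in a few lines of Python. An illustrative sketch only; since Python's own integers never overflow, the 32-bit wraparound is simulated with a mask:

```python
# Dividing a non-negative integer by 2**k is the same as shifting right by k:
n = 1000
print(n // 8, n >> 3)  # 125 125

# Finite representation: simulate a 32-bit register by masking.
# INT32_MAX + 1 wraps around to the most negative value's bit pattern:
INT32_MAX = 2**31 - 1
wrapped = (INT32_MAX + 1) & 0xFFFFFFFF
print(hex(wrapped))  # 0x80000000 -- reinterpreted as signed, that's -2**31
```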
-
What I'm saying here, and is being echoed by others, is that you don't need to know that a computer uses AND / XOR to do addition in order to write a function which adds two numbers. I believe there are many devs out there that don't know, don't need to know and quite frankly couldn't care less. You don't need to know how to write a for-next loop in assembler in order to write one in another, higher-level language, but if you 'do' know how it works at the machine level I think it gives you a greater insight into how those higher-level languages do their thing. It certainly helped me to understand what was going on under the hood. Danny
You may not need that low-level stuff in 99% of your working life, but it is the most fundamental thing our business is based on. So, just to be a good professional, you need to know it and understand it well (besides, it's not that complex). It is a matter of being capable and well educated about your profession, and of reducing the (high) level of ignorance in our society today.
-
What I'm saying here, and is being echoed by others, is that you don't need to know that a computer uses AND / XOR to do addition in order to write a function which adds two numbers. I believe there are many devs out there that don't know, don't need to know and quite frankly couldn't care less. You don't need to know how to write a for-next loop in assembler in order to write one in another, higher-level language, but if you 'do' know how it works at the machine level I think it gives you a greater insight into how those higher-level languages do their thing. It certainly helped me to understand what was going on under the hood. Danny
Danny Martin wrote:
computer uses AND / XOR
I think that on x86 it would be a few stack operations (push and pop instructions), one call, one add and one ret :)
In Soviet Russia, code debugs You!
-
Danny Martin wrote:
but how many devs know how they work it out?
Ideally a dev would learn to do addition in a representation other than decimal.
Danny Martin wrote:
How many care? Should we know?
It's not required knowledge for the average LOB-app. One can work with dates for years without knowing what an epoch is, or the difference between a directory and a folder.
Danny Martin wrote:
If you know, how did you find out
The Library :)
Bastard Programmer from Hell :suss:
Eddy Vluggen wrote:
Ideally a dev would learn to do addition in a representation other than decimal.
I think it's not really dev work. I learned this in high school (with an electronics specialization) in microprocessor classes, and again at university in classes about x86 assembler and 8051 microprocessors. So this is really another job, unless you are developing µP or µC apps in assembly :)
In Soviet Russia, code debugs You!
-
I learned it in one of my freshman/sophomore-level CS classes. My lecturer for the class didn't know that 0.1 (decimal) has a repeating, non-terminating expansion in binary. :doh: After the lecture I had to demonstrate it by working the division longhand through 2 or 3 repeats, and then by converting the repeating expansion back into a fraction.
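That demonstration is easy to reproduce. A short Python sketch that works the base-2 long division the same way (the helper binary_expansion is my own illustration, not from the original post):

```python
from fractions import Fraction

def binary_expansion(num, den, digits=20):
    """Work the long division in base 2: fractional binary digits of num/den."""
    out = []
    for _ in range(digits):
        num *= 2
        out.append(str(num // den))
        num %= den
    return ''.join(out)

# 1/10 in binary is 0.000110011001100... -- the block '0011' repeats forever.
print(binary_expansion(1, 10))  # 00011001100110011001

# A double can therefore only approximate 0.1, which is why:
print(0.1 + 0.2 == 0.3)  # False
print(Fraction(0.1))     # the exact fraction the double actually stores
```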
3x12=36 2x12=24 1x12=12 0x12=18
:| What university did you graduate from?
In Soviet Russia, code debugs You!