Code Project

Should Devs know how maths works?

The Lounge
95 Posts 45 Posters 0 Views 1 Watching
  • E edmurphy99

    the ancient one, he who speaks of floppy disks. I first used the PDP-7.

    MarvinMartian
    wrote on last edited by
    #69

    Data General 1401 as best I can recall. Had to toggle in a JPUN to kickstart the KSR terminal.

    • L Lost User

      Iain Clarke, Warrior Programmer wrote:

      Whippersnapper!

      Mewling infant! Mid 60s on this machine; I'm not in any of the photos, but the dark-haired guy in the first picture was my shift leader.

      The best things in life are not things.

      BrainiacV
      wrote on last edited by
      #70

      You got me beat. Late '60s, a PDP-8/I. But then I made up for it by being a computer operator for three Univac 418s. That's me, mid '70s, with three computers to oversee; I had to be fast enough to be in two places at once :laugh: Later, when I wrote the Biorhythm cartridge for the Bally Home Arcade (later Astrocade), I had to write a multi-byte binary multiply and divide math package to do the date calculations. I wish I had known how to do that in high school on the PDP-8/I; instead I used the EAE (Extended Arithmetic Element) hardware to do date calculations. Turns out, every once in a while, a divide would take too long and the processor would then miss interrupts (really, really bad for the timesharing system it was running).

      Psychosis at 10 Film at 11

      • D Danny Martin

    Hi Guys, I am doing a bit of research and was just wondering... How many programmers know how a computer does math? We take it for granted that those beige boxes (or white, shiny ones in my case :o) know that 2 + 2 = 4, but how many devs know how they work it out? How many care? Should we know? If you know, how did you find out, and when and under what circumstances? I learned Boolean logic in the nineties while working with 68k assembler, and it was a real eye-opener. What are the team's thoughts? Danny

        Quirkafleeg
        wrote on last edited by
        #71

        Don't all programmers start from this? http://en.wikipedia.org/wiki/File:Principia_Mathematica_theorem_54-43.png

        • B BrainiacV


          Lost User
          wrote on last edited by
          #72

          BrainiacV wrote:

          operator for three Univac 418's

          I graduated as operator from the LEO III to a Univac 1108, thence to programming and the rest is history ...

          The best things in life are not things.

          • D Danny Martin


            SeattleC
            wrote on last edited by
            #73

            Depends what you mean. Do you mean, "Do I know how addition and subtraction occur in 2's complement?" Do you mean, "Do I know what assembler instructions propagate the carry?" Do you mean, "Do I know how a carry-lookahead adder is implemented in logic gates?" These are progressively deeper levels of knowledge of how a computer does math(s). I've needed to know how 2's complement math works frequently in my career. I've obviously had to know what assembler instructions do the math and just how they work at least once or twice. I happen to know how a carry-lookahead adder works, but I've never had to use that knowledge, since I prefer to keep my big clumsy fingers out of the actual hardware. The first topic was covered in detail in my CS undergrad coursework in the late '70s. The middle topic I got exposed to during a horrible machine-language project for pay in the late '70s. I'm talking M6800 machine language, 16 kbytes RAM, no disk, paper tape, patch-the-binaries-because-the-source-code-didn't-fit-in-RAM, ahh-run-screaming! The third topic I first encountered in 9th grade, when I was so bored in school that I began designing computer circuits to avoid falling asleep. I had this "Build Your Own Working Digital Computer" book. The topic was also covered in my CS coursework.

            • D Danny Martin


              patbob
              wrote on last edited by
              #74

              I do. Both kinds (integer & floating point). I learned how integer math happened back during my high school years, when my dad and I were both learning about computers. He was more into the theory and taught me how 2's complement math worked and how logic gates could be wired together to implement it. I was never taught that level of detail again. I didn't learn how the innards of an FPU worked until college, when I was asked to write a test for the FPU hardware of the college's VAXes (apparently they failed from time to time). It even found a failure in a live system :) As for Boolean logic, that was also during my high school years. Math class taught the expressions. I played with logic gates for fun outside of school. I found one of my dad's books that had circuit diagrams for the electronics inside the gates, which I studied until I understood them. My thoughts: if a developer doesn't understand the level beneath the one they program to (i.e. the one they can debug and fix), there will be times when they are completely ineffective. We don't get paid to be ineffective. Understanding two or more levels deeper won't help (unless there's the possibility of being able to debug at that level too), so it isn't necessary. So, here's a joke absolutely every developer should find amusing. How many times will this (C) loop iterate? for (float f = 0.0f; f < 1000000000.0; f = f + 1.0f) { } Note: float is a 4-byte IEEE floating point number; pick an appropriate type in your favorite language. Please, nobody post the answer. If you don't know, go try it and figure it out for yourself.

              patbob

              • L Lost User

                Might end up getting a job down here in Canberra with the customer I'm currently working on site with. If it looks more certain (and it is looking more than promising at the moment) I will need to have a chat with you about the areas (if any) to avoid living in, and any other Canberra advice you may have.

                Michael Martin Australia "I controlled my laughter and simple said "No,I am very busy,so I can't write any code for you". The moment they heard this all the smiling face turned into a sad looking face and one of them farted. So I had to leave the place as soon as possible." - Mr.Prakash One Fine Saturday. 24/04/2004

                Chris Maunder
                wrote on last edited by
                #75

                Just stay clear of Tuggers. (Though it's been 6 years since I lived there)

                cheers, Chris Maunder The Code Project | Co-founder Microsoft C++ MVP

                • D Danny Martin


                  KChandos
                  wrote on last edited by
                  #76

                  What I've found over the years is that "normal" developers don't need to know how the computer performs math operations. Their development software simply does the magic. Here's where the caveat comes in: "Normal" developers = Business Application Developers Over my career I've primarily done business application development. In the few cases where I was doing scientific development, all the math rules changed. In the scientific arena, you need to understand what's really going on so that you can: 1. Optimize operations 2. Guarantee accuracy to xx digits Sounds simple, but it's not. If you really think it's simple just take a look at some of the "Big Number" math software from places like MIT. You'll see that how the number is constructed and managed becomes very important. Here's an exercise to try: Write a program that will divide a 100-digit number by a 50-digit number. At a company that I used to work for (back around 1989) this was exactly the challenge made to all programmers in the company by the company President. The hook? We developed in Natural on the IBM mainframe. For those who don't know, Natural is a 4GL created to run primarily against an ADABAS DBMS. Both ADABAS and Natural are the IP of Software AG. Within the company only two programmers, myself and one other, came up with solutions to this problem. His was pretty quick but had a couple of numeric domain issues, mine was slower but I neglected to check for a zero divisor. Both of us were awarded a (rather nice) bottle of champagne for our effort (the company President really just wanted to know that he had people who would actually take up the challenge. As it turns out, more than half of the employees started, but only two of us came up with practical implementations that didn't attempt to "extend" the language or environment).

                  • S Slacker007

                    Not everyone is like you. What you think should be important may not be important to me or the next guy. I do very well for myself in my profession (on all levels). To slight me because I don't get off on 0's and 1's is lame. Instead of talking smack about your intern and crying about it, why don't you take the time to show this person the connection between the 1's and 0's and why they are important.

                    -- ** You don't hire a handyman to build a house, you hire a carpenter. ** Jack of all trades and master of none.

                    User 3760773
                    wrote on last edited by
                    #77

                     If you are going to write software that does math then you have to know how computers represent numbers and how they do math. Both the representation of numbers and the methods used to do the calculations place limitations on what you can do and how you can do it.

                    • B BrainiacV


                      Doug Henderson
                      wrote on last edited by
                      #78

                       I loved that ASCII art Einstein portrait. If you don't know how the math works, you will get it wrong. Consider floating point on the Univac 36-bit machines vs the Honeywell 36-bit machines. I was tasked with porting a computationally heavy application suite. As I recall, the Univac floating point registers were the same size and format as the memory values. The Honeywell registers were a different size, so you lost bits of precision when you stored a value. The result of calculations in registers did not compare equal to the same value after it had been stored to memory. Not only did I need to know how computer math worked, but how it was implemented on two different machines.

                      • L Lost User

                        No reason to get angry and nice to hear that you are doing fine. And how could I overlook that people working on binary algorithmic calculating machines only need to know about those fundamentals when they are interested in it? Let's see if we can also find somebody who gets by perfectly without needing to know about the algorithmic part.

                        "I just exchanged opinions with my boss. I went in with mine and came out with his." - me, 2011 ---
                        I am endeavoring, Madam, to construct a mnemonic memory circuit using stone knives and bearskins - Mr. Spock 1935 and me 2011

                        jschell
                        wrote on last edited by
                        #79

                        CDP1802 wrote:

                        And how could I overlook that people working on binary algorithmic calculating machines only need to know about those fundamentals when they are interested in it?

                        No idea what your point is, since there is a great deal of programming that can be accomplished without understanding binary arithmetic (presumably that is what you are referring to), just as there is a great deal that can be accomplished without knowing how the hard drive works. Certainly in the past 10 years in my problem domains a knowledge of threads and data structures is needed even day to day and has a far greater impact than binary arithmetic. I have seldom used binary arithmetic, and I suspect that some other idioms, although more complex (in code, not knowledge), could have been substituted easily. Could be that your problem domain requires more extensive usage, but I haven't seen anything to suggest that is true for a majority of problem domains, excluding perhaps the embedded domain.

                        • D Danny Martin


                          jschell
                          wrote on last edited by
                          #80

                          Depends on the problem domain. For example, in financial sectors you need to understand exactly how floating point numbers work on a computer. But you also need to understand such things as what 'rounding' means in terms of the computer as well as in terms of business domains (which are not computer driven). But specific in-depth knowledge of integer arithmetic would not be needed. On the other hand, there are probably domains where it is essential, such as embedded controllers, which are likely to use bits to control functionality.

                          • L Lost User

                            n.podbielski wrote:

                            I think it's not really dev work

                            Math is not your work; it's knowledge that makes you better at your work. It's not required for drawing forms or manipulating Xml, but it helps a lot when you need to implement/understand an algorithm. Try writing your own BigInt in .NET 2, or Google for 'encryption' in VB6 - the latter will most likely give examples that perform a calculation on a string. It helps in understanding that a Guid is merely a large number, why there's a difference in text-encodings, and why the OR operator is used in C# to "add" enums together (BindingFlags.Public | BindingFlags.Instance)

                            Bastard Programmer from Hell :suss:

                            n podbielski
                            wrote on last edited by
                            #81

                            Eddy Vluggen wrote:

                            Try writing your own BigInt in .NET 2, or Google for 'encryption' in VB6 - the latter will most likely give examples that perform a calculation on a string. It helps in understanding that a Guid is merely a large number, why there's a difference in text-encodings, and why the or is used in C# to "add" enums together (BindingFlags.Public | BindingFlags.Instance)

                             I don't understand why you think that I don't know that... and yes, I do know (or I have an idea of what you have in mind). What I meant was that you don't need to know binary, octal or hex math and number notation in day-to-day programming. Look at the tools that people use these days: high-level languages with garbage collectors, automatic memory management, a high level of abstraction away from hardware and platform, tools for generating code, ORMs so they don't need to know the DB, frameworks so they don't need to write code for common problems and algorithms. The industry doesn't (much) need people with that kind of skill. IMHO the industry needs coders, programmers who write some class or method that validates user input against some custom value, while other things are resolved by the framework or some tools or libraries. Sorting algorithms? Hashing algorithms? Binary operations? Who uses those? The people who write frameworks, or the people who use those frameworks? How many Java programmers know x86 registers or assembly instructions? They just don't need that...

                            Eddy Vluggen wrote:

                            why the or is used in C# to "add" enums together (BindingFlags.Public | BindingFlags.Instance)

                             Not enums. Numbers that are powers of 2.

                            In soviet Russia code debugs You!

                            • N n podbielski


                              Lost User
                              wrote on last edited by
                              #82

                              n.podbielski wrote:

                              I don't understand why you think that i don't know that... and yes i do know (or i have the idea what you have in mind).

                              Just examples to convey the general idea.

                              n.podbielski wrote:

                              What i meant was that you don't need to know binary, or octal or hex math and number notation in day to day programming.

                              Depends on what you consider to be "day to day programming".

                              n.podbielski wrote:

                              How many JAVA programmers knows x86 registers? assembly instructions? Thay just don't need that...

                              They're in a Virtual Machine, there's an abstraction layer there. How many .NET programmers can read IL? It's not required in day-to-day coding perhaps, but it is beneficial knowledge. The same goes for math; and understanding something is always better than relying on abstractions created by others.

                              n.podbielski wrote:

                              Not enums. Numbers that are power of 2.

                              :)

                              Bastard Programmer from Hell :suss:

                              • D Danny Martin


                                da808wiz
                                wrote on last edited by
                                #83

                                I have had numerous co-workers who are software developers, and a lot of them make the comment or ask the question: "Do you really need to know math to do this stuff?" Well, I cannot answer that because they are my co-workers and I need to get along with them, since personality is part of the performance rating. If I were to answer that question honestly (albeit with a bit of viciousness), I would say, "It is not required if you (they) are to produce anything which impresses me, because up to this point, you (they) apparently have not. I am guessing that lack of mathematical prowess has some influence on this situation, which would be a dilemma to me but apparently is unimportant to you." Back when I was young and stupid, and openly arrogant and cruel, I believed I could get by by smoking the competition. The good thing is I learned over time this was a bad behavior, driven by some personal issue unrelated to the recipient of the attack. So now, I'm just old and stupid. Big improvement...

                                • K KurtPW

                                  I might fit into the above category. I code and I take it very seriously. I did not start my working life as a coder, in fact far from it. I have no formal education as a coder, but I have taught myself enough to get where I am. I constantly strive to improve my code and expand what I know. On those very rare occasions when I have free time I read as many tech articles and/or books on coding and theory as I can. But my math skills are still pretty poor. I know this, and I accept that I will have to do crunch-time research every time a hex issue or binary issue pops up. No, it's not the best approach, but it can work. I am at my job six years now and am writing some pretty important software for my client. FWIW, I am also thankful that there are folks smarter than I am willing to share what they know about these topics. Please remember that not everyone who doesn't do well at math is a script-kiddy slacker parasite just waiting to have you do their work for them. I realize that is NOT what you said, but I have to admit to frequently getting that feeling from many of the posts here on different topics. Maybe you math folks are just smarter than us none-too-good at math folks? I am willing to concede that point. But I would wager that many of us DO know that a clear understanding of the basics of computer math is important and that we DO try. We don't always succeed, but we try. Kurt

                                  Lost User
                                  wrote on last edited by
                                  #84

                                   So you really think I'm some arrogant bastard? That's not true, at least I hope so. And with the abundance of things which may be good to know, I also have nothing against the practical approach of tackling problems as they come. But please don't ask me a question, call me an old-fashioned fool when you don't like my answer, and then come running to me again when a similar situation arises. Already 33 years ago, when I typed my first code, some people were singing the song of 'nowadays we have this and that, and in the future we are also going to have something else, and that's why we will not ever have to care about xxx anymore'. Now I will not grin too much about how this and that have long since disappeared, or how something else never quite happened. What really matters is that xxx, which we supposedly never had to care about again, still is there. Boolean algebra is one of the things that are present at all levels: down in the hardware, in the CPU's machine code (even in the microcode used to implement the machine instructions), in every high-level language there has ever been, and also in the frameworks. Why is that so? Why are languages with limited support for Boolean algebra (like early BASIC interpreters) seen as restrictive? Why has this not been covered up with a few layers of framework, so that we can successfully pretend it does not exist? Believe me, it is a very fundamental thing. You use it every time you do an integer operation. You use it in every condition where you use some 'AND' or 'OR' operator. From a professional I would expect not having to rely on guessing for those things. I simply don't believe that anybody can get very far without having to invest far more effort in fumbling around it than in simply learning it. Not that it's really so much to learn anyway.

                                  "I just exchanged opinions with my boss. I went in with mine and came out with his." - me, 2011 ---
                                  I am endeavoring, Madam, to construct a mnemonic memory circuit using stone knives and bearskins - Mr. Spock 1935 and me 2011

                                  • L Lazar Videnov

                                    You may not need that low-level stuff in 99% of your working life, but it is the most fundamental thing that our business is based on. So, just to be a good professional, you need to know it and understand it well (besides, it's not that complex). It is just a matter of being capable, being well educated about your profession, and reducing the (high) level of ignorance in our society today.

                                    Lost User
                                    wrote on last edited by
                                    #85

                                    I agree, except that the low level stuff is always there and you do use it every time you start a line with 'if'. How is someone to write a more complex condition without understanding boolean algebra? Living happily without it is an illusion.

                                    "I just exchanged opinions with my boss. I went in with mine and came out with his." - me, 2011 ---
                                    I am endeavoring, Madam, to construct a mnemonic memory circuit using stone knives and bearskins - Mr. Spock 1935 and me 2011

                                    1 Reply Last reply
                                    0
                                    • J jschell

                                      CDP1802 wrote:

                                      And how could I overlook that people working on binary algorithmic calculating machines only need to know about those fundamentals when they are interested in it?

No idea what your point is, since there is a great deal of programming that can be accomplished without understanding binary arithmetic (presumably that is what you are referring to), just as there is a great deal that can be accomplished without knowing how the hard drive works. Certainly, in the past 10 years in my problem domains, a knowledge of threads and data structures has been needed day to day and has had a far greater impact than binary arithmetic. I have seldom used binary arithmetic, and I suspect that other idioms, although more complex (in code, not knowledge), could easily have been substituted. It could be that your problem domain requires more extensive usage, but I haven't seen anything to suggest that is true for a majority of problem domains, excluding perhaps the embedded domain.

                                      L Offline
                                      L Offline
                                      Lost User
                                      wrote on last edited by
                                      #86

The point is that boolean algebra is the very foundation on which the entire machine has been built, and it does not go away because you want it to. At the hardware level, algorithms are implemented with logic gates. The microprocessor is an implementation of an algorithm that allows you to formulate algorithms in software; it too is built on boolean gates, and its machine code in turn is built on boolean algebra. It does not matter how many layers of frameworks you have or how abstract your programming language may be: boolean algebra is always there. It's not even hidden very well. You use it whenever you perform an operation on an integer value, and in every condition of an 'if' statement. Those things translate almost 1:1 into machine instructions and operations on hardware registers. Isn't it strange that no language or framework has ever successfully abstracted that away? Now, please tell me, what do you think of a developer who can't predict what happens after an operation on some data type, or who is not sure how to formulate a non-trivial condition?

                                      "I just exchanged opinions with my boss. I went in with mine and came out with his." - me, 2011 ---
                                      I am endeavoring, Madam, to construct a mnemonic memory circuit using stone knives and bearskins - Mr. Spock 1935 and me 2011

                                      J 1 Reply Last reply
                                      0
                                      • D Dave Parker

I'm interested, but seeing as it wouldn't really benefit my current job (which tends to be dealing with users asking questions, deploying things and going to meetings 95% of the time, with very little coding or design, plus I keep hearing murmurings that I might soon be forced to work with SharePoint), I never end up going into the low-level side that much. A shame, really, as it's the inner workings that interest me more. In these days of 8-core CPUs, of which the busiest core is typically never more than 2% busy under normal use, I'd guess there aren't many situations that call for that kind of thing anymore, though.

                                        L Offline
                                        L Offline
                                        Lost User
                                        wrote on last edited by
                                        #87

When you are the general, it's easy to say that you don't have to know how to shoot as long as your soldiers do :) Seriously, if you look behind the compilers and frameworks, you will see many things that are wasteful and inefficient. Those things have become common practice because we have machines with strong processors and huge amounts of memory. The problem is that many developers quickly become helpless when they have to make do with slightly less of everything.

                                        "I just exchanged opinions with my boss. I went in with mine and came out with his." - me, 2011 ---
                                        I am endeavoring, Madam, to construct a mnemonic memory circuit using stone knives and bearskins - Mr. Spock 1935 and me 2011

                                        1 Reply Last reply
                                        0
                                        • D Danny Martin

Hi Guys, I am doing a bit of research and was just wondering... How many programmers know how a computer does math? We take it for granted that those beige boxes (or white, shiny ones in my case :o) know that 2 + 2 = 4, but how many devs know how they work it out? How many care? Should we know? If you know, how did you find out, and when / under what circumstances, etc.? I learned Boolean logic in the nineties while working with 68k assembler, and it was a real eye opener. What are the team's thoughts? Danny

                                          K Offline
                                          K Offline
                                          KP Lee
                                          wrote on last edited by
                                          #88

I believe around '78-'79. I started programming late in '74. It did get into precision, and my matrix math class got into some non-"well formed" computer math problems (first time I could get near a computer; the community college I started at wouldn't let students near their computer, it was too valuable). Hired by Boeing in '76, COBOL coding with IBM. First thing they did was send me to a JCL class. Didn't learn anything I didn't know.

Transferred to a new group in '78. They sent me to class because I would be working with CDC machines and FORTRAN. Learned to read dumps using octal output and how code used memory and registers to actually do the math, plus minor changes in FORTRAN. Not too much about the architecture of the CDC, just that it used 60-bit words, so it worked well with octal.

Around '80 I was about to become a tech rep. This was a different teaching experience. I had three, sometimes four teachers for two weeks. Each teacher would go 1 on 1 for 2 hours straight, and then either a 4th teacher for a final 2 hours or two teachers taking on another hour each. I learned about 1's/2's comp, IBM's structure, more detail about the physical structure of the CDC and how data moved, ditto for the CRAY and IBM, and in-depth instruction on the precision and workings of 60, 64, and 32 bit machines. Also finite element analysis concepts and 3 of its languages. At that time, defining different integer types wasn't used much; you defined an integer or real, and it used the structure of the machine to define it. They made it very clear about the superior precision both the CDC and the CRAY had over the joke of the engineering world - IBM. I think I learned more in those 2 weeks than in the 2 years of what I thought was a brutal workload at WSU.

Took some classes on my own in 2004. The binary instruction I got was a joke (never covered the sign bit), and the rest of the students couldn't care less about the little info that was given.

                                          1 Reply Last reply
                                          0