Code Project: The Lounge
Higher Software Education

Tags: help, question, csharp, hardware
59 Posts, 39 Posters
Dean Moe wrote:

This is my first post, so be gentle. My background is mostly on the hardware side of computers. Back in the early 80's, when motherboards were expensive ($1.5k to $30k - and not Apple computers!), I made a living repairing them. Part of my job involved writing assembly programs to diagnose individual components and locate the one with a problem. (Try finding a memory chip with one blown bit on an array of four boards, or a bit stuck on in the logic circuitry.) So I understand how computers work: what exactly happens when a handle (or pointer) is created, what a BLT is and how it works with the CPU differently from other programming operations, and so on. Basically, the nuts and bolts. Since I took up VB.Net and started trying to wrap my mind around all the concepts that make up OOP - polymorphism, delegates, reflection, etc. - I've noticed that a lot of the fundamentals of how a computer really works are never talked about. For instance, when a beginning programmer asks, "BackColor = Color.Transparent only shows a black screen. Why?" the typical response is silence, "Microsoft doesn't support it," or "it doesn't work." I know why. Do you? Just as fundamentals in baseball are necessary to win the World Series, I would think they are necessary in programming. My question is: what do they teach in school about computer fundamentals? Have those of you who have been programmers for years ever thought about it?
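The blown-bit hunt described above can be sketched in a few lines of C - a walking-ones test, offered here as an illustrative sketch rather than the original assembly diagnostic:

```c
#include <stdint.h>

/* Walking-ones test: write a single set bit at each position of a
 * word, read it back, and report which bit (if any) fails. The
 * inverse pattern is also checked, to catch a bit stuck at one.
 * Returns the index of the first bad bit, or -1 if the word is clean. */
int find_stuck_bit(volatile uint16_t *word)
{
    for (int bit = 0; bit < 16; bit++) {
        uint16_t pattern = (uint16_t)(1u << bit);
        *word = pattern;                  /* write the walking one */
        if (*word != pattern)             /* read back and compare */
            return bit;                   /* bit stuck at zero     */
        *word = (uint16_t)~pattern;
        if (*word != (uint16_t)~pattern)
            return bit;                   /* bit stuck at one      */
    }
    return -1;                            /* all 16 bits check out */
}
```

On real diagnostic hardware the pointer would aim at the suspect RAM location; run over a whole bank, the failing address plus the failing bit index together point at the one bad chip.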

User 2823383 (#27) wrote:

Started as an Electronics Technician on a submarine in the early 70's. Worked on 15-bit-word computers, where the only programming was pressing buttons on the control panel (one button to set a bit, another to clear the word) to enter instructions into individual memory locations. Did that to manipulate memory, registers, and the CPU for troubleshooting. BS in Math in the 80's, MS in CS in the early 90's. The only programming language course I took was Fortran, as a requirement for the math degree. When I was teaching CS, the fundamentals were data structures and algorithms; languages were learned on the side. I'm not sure what is taught now, but I think my hardware background helped me understand how and why computers worked, so I could concentrate on programming to make them do what I want.

Phil Martin (#28), in reply to Dean Moe's original post:

      Gandalf7 wrote:

For instance, when a beginning programmer asks, "BackColor = Color.Transparent only shows a black screen. Why?" the typical response is silence, "Microsoft doesn't support it," or "it doesn't work." I know why. Do you?

That is an interesting question to set the scene with - because the answer has very little to do with hardware. The whole sordid history goes back many years, with some of the design decisions influenced by hardware, but most of that isn't relevant any more to understanding why the result is the way it is. And to answer your questions:

      Gandalf7 wrote:

      What do they teach in school about computer fundamentals?

In high school (a decade ago now) I was exposed to very little. The only electronics-related parts were in physics, and that was only simple parallel circuits with resistors - no transistors or anything beyond that. Computer fundamentals were just as rare for me. There was one computing class, but it was considered a "bludge" class that no one took seriously.

      Gandalf7 wrote:

      Have you guys that have been programmers for years ever thought about it?

Yes, many times. Quite often I see a question whose answer is obvious to me, but only because of experience and exposure to other fields. It will sure be interesting talking to graduate programmers in a decade, when the only resource management they have ever known is garbage collection. Perhaps it will be really enlightening, because their analytical minds will have been freed from the shackles of implementation details and allowed to think of amazing solutions to tough problems. Or not. ;P

Henry Minute (#29), in reply to User 2823383:

        Member 2825662 wrote:

        Worked on 15 bit word computers

        Would they have been ICL jobs? They are the only 15 bit word computers I worked on, or indeed have knowledge of.

        Henry Minute Do not read medical books! You could die of a misprint. - Mark Twain Girl: (staring) "Why do you need an icy cucumber?" “I want to report a fraud. The government is lying to us all.”

MrPlankton wrote:

BS mid 80's, MS late 90's:

  • compiler writing, YACC and Bison
  • 360 assembler
  • C, C++, FORTRAN, COBOL
  • Boolean logic, FPGA programming
  • TCP/IP and similar protocols

Good background for all but my last position. What I wish we'd had: more statistics - SAS, R, and SUDAAN. Everyone coming out of college and employed where I work is kicking my butt in SAS. Trying to learn it as fast as possible.

          MrPlankton
          “If I had my choice I would kill every reporter in the world but I am sure we would be getting reports from hell before breakfast.” William Tecumseh Sherman

GuyThiebaut (#30) wrote:

          MrPlankton wrote:

Everyone coming out of college and employed where I work is kicking my butt in SAS. Trying to learn it as fast as possible.

Same experience for me (BSc) - I am working in SAS and struggling not to dislike it intensely. It's one of those programming languages where lots of 'stuff' has been bolted on with no regard for the overall paradigm of the language - grrr! :mad:

          Continuous effort - not strength or intelligence - is the key to unlocking our potential.(Winston Churchill)
Lost User (#31), in reply to Dean Moe's original post:

A lot of computer programmers are self-trained; they vary in skill from your basic tyre fitter to an aeronautical engineer, to use an analogy. Most computer science courses have some assembler, C, algorithms, and data structures modules. Without this knowledge I don't know how anyone can understand what their programs are really doing. Higher levels of abstraction can be useful; the physical computer architecture or virtual machine design seems a reasonable cut-off point for most people. Most programmers certainly don't need to know PNP doping levels, etc. I learnt some of this in computer science classes, some in physics, electronics, or maths classes, and the rest is self-taught. I think it's important to know this stuff if you work with computers.

Many people I work with are the 'tyre fitters': they basically put square pegs in square holes. They can use an IDE and a database, but they could not tell you how either was written, and they could not write a compiler, assembler, or virtual machine. Most people do not realise this distinction; they think they are high-flying programmers, but really they are the factory workers of the 20th century, and it's no surprise their jobs get offshored.

I have been programming for 17 years, and I have thought about this for years. I learnt 68000 and 8086 assembler back then, and I have to agree that electronics and possibly even physics and maths majors probably understand the fundamentals better than most programmers; I have worked with such people on embedded projects. Now I code entirely in high-level languages: C++, Java, C#. Java and C# are very easy languages to learn; anyone who can manage to write anything in assembler should not struggle. One thing that has changed is the application domain: there is a lot of framework, library, architecture, pattern, OS, and other tooling that, while less fundamental, is often required to produce a modern application in an acceptable timeframe.

            modified on Wednesday, February 11, 2009 1:25 PM

Maxxx_ wrote:

16!!!!!! I only had eight! And when I say big red button, it wasn't that big. Or that red, really. But it was a button. We had to get up at six o'clock in t'morning and lick t'road clean wi' t'tongue.

              ___________________________________________ .\\axxx (That's an 'M')

ghle (#32) wrote:

              Maxxx_ wrote:

              16!!!!!! I only had eight! And, when I say big red button, it wasn't that big. Or that red, really. But it was a button.

We had it nice: 16 LEDs, plus one for overflow and one for Run/Halt. And our big red button was a small gray push-button; the big red one was E-Stop. What was sweet is that the machine booted in one clock cycle - the time it took to sense that the gray button was pushed. I agree that C and C++ are the path to take for a hardware geek; you can view the assembly code to get those warm fuzzies again. The programmers that work for me today have zero hardware experience. I have found over time that some developers (bad ones) have no concept of how a computer even works; they just know how to translate some English into another language. But they also cannot optimize code (back in the day, I had to know that i++ and ++i took different amounts of memory so we could get the code to fit into RAM), nor do they understand why building loops in different ways can affect the speed of the application.
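The point about loop construction affecting speed can be illustrated with a small C sketch (a hypothetical example, not from the post): both functions compute the same total, but the first walks the array in the order C lays it out in memory, while the second strides across rows and touches a new cache line on almost every access, which is typically much slower on real hardware.

```c
#include <stddef.h>

#define N 512
static int grid[N][N];   /* static array, zero-initialized */

/* Inner loop walks one row at a time: sequential, cache-friendly. */
long sum_row_major(void)
{
    long total = 0;
    for (size_t i = 0; i < N; i++)
        for (size_t j = 0; j < N; j++)
            total += grid[i][j];
    return total;
}

/* Inner loop jumps N ints ahead on every step: the same result,
 * but a strided access pattern that defeats the cache. */
long sum_column_major(void)
{
    long total = 0;
    for (size_t j = 0; j < N; j++)
        for (size_t i = 0; i < N; i++)
            total += grid[i][j];
    return total;
}
```

Same answer, very different memory traffic - exactly the kind of difference a programmer with no hardware model has no reason to expect.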

              Gary

Fabio Franco (#33), in reply to Dean Moe's original post:

I started programming before I got into college, so I didn't understand the nuts and bolts of what I was doing either. Programming books didn't teach that, but I kept on going. Only recently have I come to understand the nuts and bolts, because I'm attending a Computer Engineering course in college. I'm really happy to learn these things, and I'm glad I chose Computer Engineering instead of Computer Science. And I agree with you: I think knowing the fundamentals is necessary in programming, at least if you are going to be a successful programmer. Fábio

Richard Jones (#34), in reply to Dean Moe's original post:

A lot of it comes from the "turn the key and go" mentality. <analogy> Driver education rarely teaches kids how a car works, just how to drive it. Then we end up with "It's making a funny noise from the thingy up front." I remember a four-week Driver & Maintenance course I took in the Army. We had to diagnose everything and repair almost everything in the field. Imagine sticking your hand into the throttle linkage while straddling a running engine, to push the fuel cutoff (a student had pulled too hard on the cable and broken it). </analogy>

                  Cheetah. Ferret. Gonads. What more can I say? - Pete O'Hanlon

_Damian S_ wrote:

                    When I did my degree (back in the early 90's) they taught programming concepts and practices rather than any particular language. Languages we taught ourselves, and applied the concepts we had learned.

                    -------------------------------------------------------- Knowledge is knowing that the tomato is a fruit. Wisdom is not putting it in fruit salad!!

Ray Cassick (#35) wrote:

I think this is important too, but teaching these concepts through a language is also a good idea. Many of the concepts are difficult to cover without concrete examples, and teaching the examples within some context helps most folks understand things a bit better. I went through one class that attempted to teach software concepts using only pseudocode. The problem is that the students were so stuck on the syntax of the fake code that the programming concepts themselves were often lost.



Tom Delany wrote:

I agree. There's just too much information involved - like the old "trying to drink from a firehose" adage. I think it would be a rare person who could totally understand every aspect of computers, languages, development, etc. today. Personally, I think it would be physically impossible for one person to get their head around everything and totally understand it. IMHO :sigh:

                      WE ARE DYSLEXIC OF BORG. Refutance is systile. Your a$$ will be laminated.

werD (#36) wrote:

Agreed. When someone remarks that I must know "all" about computers, I usually put it like this: "The day I know it all is the day I get left behind."

                      DrewG, MCSD .Net

User 2823383 (#37), in reply to Henry Minute:

They were made by Univac, converted from some commercial mainframe they had at the time. They had 15 data bits plus one parity bit. We had 64 Kwords of magnetic-core memory in 4 banks. No keyboard or display, although we connected it to an IBM Selectric typewriter that was modified with interface electronics. At least we had magnetic tape, not paper tape like some of the other computers on the boat. :cool:
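The 15-data-bits-plus-parity layout can be sketched in C. Even parity, and putting the parity bit in the top position, are assumptions here - the post doesn't say which convention the Univac machine actually used:

```c
#include <stdint.h>

/* XOR-fold a word down to a single bit: 1 if an odd number of
 * bits are set, 0 otherwise. */
static uint16_t parity_of(uint16_t w)
{
    w ^= w >> 8;
    w ^= w >> 4;
    w ^= w >> 2;
    w ^= w >> 1;
    return w & 1u;
}

/* Pack 15 data bits and store the parity bit in bit 15, so the
 * full 16-bit word always has an even number of ones. */
uint16_t add_parity(uint16_t data15)
{
    data15 &= 0x7FFF;
    return (uint16_t)(data15 | (parity_of(data15) << 15));
}

/* A stored word is consistent when its 16 bits XOR to zero; any
 * single flipped bit breaks this, so it is detected. */
int parity_ok(uint16_t word)
{
    return parity_of(word) == 0;
}
```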

Ed Leighton Dick (#38), in reply to Dean Moe's original post:

I have a Computer Science degree from a liberal arts college, so I have some hardware background (though not nearly as much as I would have gotten from an engineering program). I have a decent understanding of how computers work at the physical level, and that has helped me more times than I can count when I'm up against a difficult problem, especially when it comes to optimizing code. Nowadays, most of the colleges in my area have gone to what they call a "Management Information Science" degree - basically a combination of programming and business degrees. For the most part, students in these programs seem to learn how to throw code together, but not why you do what you do. That turns out a lot of clueless people when it comes to actually doing the work. I agree with others who have said that the abstraction of modern languages makes some of that in-depth hardware knowledge unnecessary. However, you do need at least a basic knowledge of how the machines work to do this job effectively. Ed

etkid84 (#39), in reply to Dean Moe's original post:

Really, it's a personality thing. The colleagues of mine that I think are "the very best" all have very strong analytical, problem-solving personalities - and an unbounded amount of intellectual curiosity. Their educational backgrounds were not necessarily in computer science either (although some eventually received master's degrees in CS). All had undergraduate degrees, ranked here by numbers of people (computer science majors not included):

  • electrical engineering
  • physics and mathematics
  • other engineering disciplines (like mechanical)

As my sister and I always say: it's all about solving puzzles; you have to like solving puzzles. As for me, I like to fix things, find out how things work, and make things - I suppose that's why I love being a software engineer so much. Kind regards,

                            David

Matt Totten (#40), in reply to Dean Moe's original post:

                              I've been out of university for a little less than three years. We didn't study a lot about hardware, but we did have an opportunity to build an 8-bit machine from logic gates. We touched on assembly briefly, but that was very limited. I'd say we got a brief introduction at best to low-level concepts. Not enough to be any sort of expert, but enough to generate interest in them, if one wanted to pursue them further.

Alan Balkany (#41), in reply to Dean Moe's original post:

My impression is that a BS degree in Computer Science generally (depending on the university) includes:

  1. Introductory programming (Pascal or C in the 80s, C++ in the 90s, now Java or C#)
  2. Data structures (stacks, queues, lists, trees, graphs, hash tables, etc.)
  3. Assembly language
  4. Operating systems
  5. "Hardware without electronics" (my term): design of registers, multiplexers, etc., up to the design of a simple computer using gates (AND, OR) as primitives - no clue about how gates are made, or transistors
  6. Additional courses depending on the university, degree program, and the student's selections, such as programming languages, finite automata, AI, graphics, file structures, databases, numerical methods, etc.
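Item 5 above - building up a simple computer from AND/OR gates - can be sketched in C: a one-bit full adder built from gate-level operations, chained into a 4-bit ripple-carry adder. This is an illustrative exercise in the spirit of such a course, not any particular curriculum's code:

```c
/* XOR derived from AND, OR, and NOT, the way a gate-level course
 * would have students build it. Inputs and outputs are bits (0/1). */
static int xor_gate(int a, int b) { return (a | b) & !(a & b); }

/* One-bit full adder: two inputs plus carry-in, sum plus carry-out. */
static void full_add(int a, int b, int cin, int *sum, int *cout)
{
    int half = xor_gate(a, b);
    *sum  = xor_gate(half, cin);
    *cout = (a & b) | (half & cin);
}

/* Four full adders chained: the carry ripples from bit 0 upward.
 * The result is 5 bits wide so the final carry-out is visible. */
int add4(int a, int b)
{
    int sum = 0, carry = 0;
    for (int i = 0; i < 4; i++) {
        int s;
        full_add((a >> i) & 1, (b >> i) & 1, carry, &s, &carry);
        sum |= s << i;
    }
    return sum | (carry << 4);
}
```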


                                  Theodore M Seeber
                                  #42

                                  I separate the programmers in my life and on the teams I work on into two groups: those who know the fundamentals and those who don't. I find the ones who don't (mainly younger programmers who graduated from college after 1997) have a very GUI-centered view of the universe. .NET in general has a GUI-centered view of the universe. This can be really useful if you're focused on end results and the GUI. It's not terribly useful if your application isn't scaling well, or you need to hand-optimize anything at all. The best team has both types of programmers on it, the project is split into tiers, and the lower-level a tier is, the more experience you need on it.


                                    ely_bob
                                    #43

                                    I'm 5 years into programming, just out of grad school (computational chemistry), and needed to take some CSCI courses. Here's what I think should be mentioned. Yes, for OOP, learning the "nuts & bolts" isn't fully necessary; at some point you may pick up a couple of "tricks" that exploit some of the underlying hardware structure, but... "you don't need to learn why, just that if you do it in this order it works better" (as explained to me by a PhD in computer science). There is a hierarchy in who uses what language, and by all accounts here it is (in academia):
                                    Assembly, Fortran, Java = Engineering
                                    Assembly, Fortran, C, Java = Physics, Math
                                    Assembly, Fortran, C, C++ = Chemistry
                                    Fortran, C, C++, Java = Biology (psychology) (Java usually for students, not for "real" projects, in my experience; nothing against Java)
                                    This is due to the amount of control, perceived or otherwise, the user actually needs over the optimization of the application they are working on, and the average size of the program (my understanding is that the programs get larger as they go down the list: more verbose and less sleek). However, in the CSCI courses I attended they don't make this distinction (for the most part) and suggest using the language that you know if it will get the job done, which is also the sentiment I perceive from many people/posts on here. There are exceptions, but largely, even in physics and chemistry, we are not so much worried about the hardware level of things. However, when it becomes necessary (some of our compute runs take weeks to complete for a single data point), we take a look under the hood to see where we can tinker to get some extra cycles. Typically this is more stylistic than architecture-based, because we run on a variety of workstations (there is a Cray, I think, in the math department!), so we are relegated to using methods that work sufficiently well on a number of systems.
                                    This doesn't mean that when we were looking at "forcing" a calculation we didn't look into hardware dependence and process timing; however, the return for the effort just wasn't there. Our time would be much more wisely spent making yet another module/library that is "good enough" for most situations. Overall, I believe that if someone becomes "limited" by either what they know: For instance, when a beginning programmer asks "BackColor = Color.Transparent only shows a black screen, Why?" The typical respon


                                      theripevessel
                                      #44

                                      I graduated in 2003 from a highly rated computer science program. I was exposed to many of the fundamentals; I've read Knuth and the whole bit. I have always wanted to believe that knowing the "nuts and bolts" makes you better, and I probably still do. However, in my experience so far (5+ years working), it has not helped me, and people from other backgrounds do just as well knowing only the latest C# web controls. It is an interesting question, though: is there another field where knowing the last 5 years of technology and nothing else might suit you better than someone who has been in the business for decades?


                                        Homncruse
                                        #45

                                        I just graduated in 2008 with a major in Computer and Software Systems. It's not a traditional Computer Science or Computer Engineering degree; it attempts to blend the two together, and I usually describe it as a Software Engineering degree. Prior to that, I received two Associate-level degrees, in Computer Science and Engineering Technology. Ironically, my Engineering Technology degree required more computer courses than my Computer Science one did: the AS-CS degree only required one computer course, "Level I" C++ Programming. My BS-CSS program was much more involved, for obvious reasons. The only catch to my experience detailed below is that the university I attended was a small satellite campus (University of Washington-Bothell) and everything was on a small scale, including the classes. We didn't have lecture halls, for example, and classes were considered "large" with 20-25 students enrolled; most had about 15. So to answer your question, "what do they teach in school about computer fundamentals"... not much, especially at higher levels. There's no good place to grasp the mid-level concepts you mentioned. In the intro courses, that's wayyy too advanced (they usually factor to the lowest common denominator, so the final program in those courses is usually the 17th iteration of "Hello World"). In the higher courses, it's assumed you already know them, BUT if you don't, chances are one of your classmates does and will give you a crash course, or the instructor/professor/TA will assist you. The higher-level courses teach you the approach to solving problems (e.g., algorithms) and high-level theory (e.g., when calling SomeShape->Draw(), what does the video driver DO [mostly ignoring hardware], and why is it better than some other way?), but most of all, general practices for approaching software as a whole (e.g., Software Development Lifecycles [SDLCs]).
                                        For all the programming/language-specific details, that's what Google and resources such as MSDN (or CodeProject! :P) are for. It seems counter-intuitive, but it makes sense to ignore specific language inquiries if you think about it. For those here who graduated their respective CS programs in the 80s and are still actively working in the industry, how many of you still use the same languages today that you did back in college? In my program, most of our work was done in C++, despite .NET's prevalence in this area due to Redmond being right around the corner. The argument given was that C++ is versatile enough that it

                                        In reply to theripevessel: "is there another field where if you know the last 5 years of technology and nothing else you might be better suited than someone who has been in the business for decades?"

                                          Lost User
                                          #46

                                          Yes, they can have a career making business apps on Windows. Would they be useful on projects like:
                                          - A computer game
                                          - A compiler
                                          - A graphics program
                                          - An iPhone app
                                          - An engine management system
                                          - A CAD program
                                          - A web framework
                                          - A database
                                          - A 3D engine
                                          - A distributed file system
                                          - A multithreaded web server
                                          - Computational chemistry
                                          - Medical imaging
                                          - etc., etc.
                                          They would be useless on a vast range of applications that don't involve Windows and drag-and-drop. Luckily for them, 80% of the jobs are vanilla Windows business apps.
