Code Project: The Lounge
Higher Software Education

Tags: help, question, csharp, hardware
Dean Moe wrote:

This is my first post, so be gentle. My background is mostly on the hardware side of computers. Back in the day when motherboards were expensive ($1.5k - $30k) in the early 80's (not Apple computers!), I made a living repairing them. One part of my job involved writing assembly programs used to diagnose individual components to locate the one having a problem. (Try finding a memory chip with one blown bit on an array of four boards, or a stuck bit in the logic circuitry.) So I understand how computers work: what exactly happens when a handle (or pointer) is created, what a BLT is and how it works differently with the CPU than other programming operations, etc. Basically, the nuts and bolts.

I've noticed, since I took up VB.Net and started trying to wrap my mind around all the concepts that make up OOP - polymorphism, delegates, reflection, etc. - that a lot of the fundamentals of how a computer really works are never talked about. For instance, when a beginning programmer asks, "BackColor = Color.Transparent only shows a black screen - why?", the typical response is silence, "Microsoft doesn't support it", or "it doesn't work". I know why; do you? Just as fundamentals in baseball are necessary to win the World Series, I would think they would be necessary in programming. My question is: what do they teach in school about computer fundamentals? Have you guys who have been programmers for years ever thought about it?
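(On the Color.Transparent question: a minimal C# WinForms sketch of the usual explanation. WinForms has no true per-pixel transparency; Color.Transparent is simulated by having a control paint its parent's background first and then draw itself on top. A top-level form has no parent to borrow pixels from, so the alpha is ignored or rejected. The class and color choices below are illustrative, not the poster's code.)

    using System;
    using System.Drawing;
    using System.Windows.Forms;

    // Transparent back colors are simulated: the control asks its parent
    // to paint the background, then draws itself on top. With no parent
    // (a top-level Form), there is nothing to copy, hence black screens
    // or an ArgumentException, depending on the control.
    class SeeThroughControl : Control
    {
        public SeeThroughControl()
        {
            // Opt in to the simulation; without this line, assigning
            // Color.Transparent throws.
            SetStyle(ControlStyles.SupportsTransparentBackColor, true);
            BackColor = Color.Transparent; // really means "show the parent"
        }
    }

    static class Demo
    {
        [STAThread]
        static void Main()
        {
            var form = new Form { BackColor = Color.Magenta };
            // The usual whole-window workaround: a color key punches
            // holes in the window wherever magenta pixels would be.
            form.TransparencyKey = Color.Magenta;
            form.Controls.Add(new SeeThroughControl { Left = 10, Top = 10 });
            Application.Run(form);
        }
    }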

ely_bob wrote (#43):

I'm 5 yrs into programming, just out of grad school (computational chemistry), and needed to take some CSCI courses. Here's what I think should be mentioned. Yes, for OOP, learning the "nuts & bolts" isn't fully necessary; at some point you may get a couple of "tricks" under your belt that exploit some of the base hardware structure, but... "you don't need to learn why, just that if you do it in this order it works better" (as explained to me by a PhD in computer science).

There is a hierarchy in who uses what (language), and by all accounts here it is (in academia):

  • Assembly, Fortran, Java = Engineering
  • Assembly, Fortran, C, Java = Physics, Math
  • Assembly, Fortran, C, C++ = Chemistry
  • Fortran, C, C++, Java = Biology (psychology)

(Java is usually for students, not for "real" projects, in my experience - nothing against Java.) This is due to the amount of control, perceived or otherwise, that the user actually needs over the optimization of the application they are working on, and the average size of the program (in this list, it is my understanding that the programs get larger as they go down the list - more verbose and less sleek). However, in the CSCI courses I attended they don't make this distinction (for the most part) and suggest using the language that you know if it will get the job done, which is the sentiment I perceive from many people/posts on here.

There are exceptions, but largely, even in physics and chemistry we are not so much worried about the hardware level of things. However, when it becomes necessary (we have a number of compute cycles that take weeks to complete, for one data point), we take a look under the hood to see where we can tinker to get some extra cycles. Typically this is more stylistic than architecture-based, because we are running on a variety of workstations (there is a Cray, I think, in the math department!), so we are relegated to using methods that work sufficiently well on a number of systems. This doesn't mean that when we were looking at "forcing" a calculation we didn't look into hardware dependence and process timing; however, the return for the effort just wasn't there - our time would be much more wisely spent making yet another module/library that is "good enough" for most situations.

Overall, I believe that if someone becomes "limited" by what they know...

In reply to Dean Moe:

theripevessel wrote (#44):

I graduated in 2003 from a highly rated computer science program. I was exposed to many of the fundamentals; I've read Knuth and the whole bit. I have always wanted to believe that knowing the "nuts and bolts" makes you better, and I probably still do. However, in my experience so far (5+ years working), it has not helped me, and people from other backgrounds do just as well knowing only the latest C# web controls. It is an interesting question, though: is there another field where, knowing the last 5 years of technology and nothing else, you might be better suited than someone who has been in the business for decades?

In reply to Dean Moe:

Homncruse wrote (#45):

I just graduated in 2008 with a major in Computer and Software Systems - it's not a traditional Computer Science or Computer Engineering degree; it makes an attempt to blend the two together. I usually describe it as a Software Engineering degree. Prior to that, I received two Associate-level degrees, in Computer Science and in Engineering Technology. Ironically, my Engineering Technology degree required more computer courses than Computer Science did. The AS-CS degree only required one computer course - "Level I" C++ Programming. My BS-CSS program was much more involved, for obvious reasons. The only catch to my experience detailed below is that the university I attended was a small satellite campus (University of Washington-Bothell) and everything was on a small scale, including the classes - we didn't have lecture halls, for example, and most classes were considered "large" at 20-25 students enrolled; most were about 15 students.

So to answer your question, "what do they teach in school about computer fundamentals"... not much, especially at higher levels. There's no good place to grasp the mid-level concepts you mentioned. In the intro courses, that's wayyy too advanced (they usually factor to the lowest common denominator, so the final program in those courses is usually the 17th iteration of "Hello World"). In the higher courses, it's assumed you already know them, BUT if you don't, chances are one of your classmates does and will give you a crash course, or the instructor/professor/TA will assist you. In the higher-level courses, they teach you the approach to solving problems (e.g., algorithms), high-level theory (e.g., when calling SomeShape->Draw(), what does the video driver DO [mostly ignoring hardware], and why is it better than some other way?), but most of all, general practices on HOW to approach software as a whole (e.g., Software Development Lifecycles [SDLCs]).

For all the programming/language-specific details, that's what Google and resources such as MSDN (or CodeProject! :P) are for. It seems counter-intuitive, but it makes sense to ignore specific language inquiries if you think about it. For those here who graduated their respective CS programs in the 80s and are still actively working in the industry, how many of you still use the same languages today that you did back in college? In my program, most of our work was done in C++, despite .NET's prevalence in this area due to Redmond being right around the corner. The argument given was that C++ is versatile enough that it...
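(A minimal C# sketch of the SomeShape->Draw() idea above, since it is the one concrete mid-level concept the post names; the shape classes are invented. The point courses gloss over is the mechanism: one call site, with the runtime picking the right Draw through the object's method table.)

    using System;
    using System.Collections.Generic;

    // Illustrative only: the caller never tests the concrete type; the
    // runtime dispatches Draw through the object's method table.
    abstract class Shape
    {
        public abstract void Draw();
    }

    class Circle : Shape
    {
        public override void Draw() => Console.WriteLine("rasterize circle");
    }

    class Rect : Shape
    {
        public override void Draw() => Console.WriteLine("rasterize rectangle");
    }

    static class Scene
    {
        static void Main()
        {
            var shapes = new List<Shape> { new Circle(), new Rect() };
            foreach (Shape s in shapes)
                s.Draw(); // one call site, many behaviors
        }
    }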

In reply to theripevessel:

Lost User wrote (#46):

Yes, they can have a career making business apps on Windows. Would they be useful on projects like:

  • A computer game
  • A compiler
  • A graphics program
  • An iPhone app
  • An engine management system
  • A CAD program
  • A web framework
  • A database
  • A 3D engine
  • A distributed file system
  • A multithreaded web server
  • Computational chemistry
  • Medical imaging
  • etc, etc...

They would be useless on a vast range of applications that don't involve Windows and drag-and-drop. Luckily for them, 80% of the jobs are vanilla Windows business apps.

In reply to Dean Moe:

jim norcal wrote (#47):

I have to agree with Roger on learning VB.Net versus C# or some other OO language. VB.NET is what I focused on for years, because I originally learned the basics of VB 6 in college, so I stayed that route via self-learning over the years. Now I'm trying to learn C# because that's what everyone seems to use. I've watched Code Project articles shift from a high percentage of VB topics to what seems to be 90% C# articles and 2% VB. Whenever I go searching for a code example on the web in a subject I need some help with, nearly every example I find is in C#. So it looks like it's time to give up VB and go the other route, since, apparently, the rest of the world already has. I wish I hadn't put so many months of my time into learning VB and instead focused on C#, or even C. Oh well. Time to start all over again.

In reply to Dean Moe:

bVagadishnu wrote (#48):

Course 6.031, Structure and Interpretation of Computer Programs, was probably the course I learned the most from. Once you understand the hows and whys, programming in any particular language is mainly syntax. :cool:

Lost User wrote:

                16!!!!!! I only had eight! And, when I say big red button, it wasn't that big. Or that red, really. But it was a button. We had to get up at six O'Clock in t'morning and lick t'road clean with 'tongue.

                ___________________________________________ .\\axxx (That's an 'M')

Brad Stiles wrote (#49):

                Softie. We had to draw the wires ourselves, and then solder them together. When we needed another program, we ripped those wires out and put in new ones. Whippersnappers!


jinksk wrote (#50):

                  _Damian S_ wrote:

                  When I did my degree (back in the early 90's) they taught programming concepts and practices rather than any particular language. Languages we taught ourselves, and applied the concepts we had learned.

That's why most employers wonder why most new CS & SE graduates have never written an application of more than 1,000 lines of code. :doh:

                  Why go back to the drawing board when you have a Tablet PC?

In reply to jinksk:

Dan Neely wrote (#51):

I wrote several apps in the 2-10k line range before getting my degree. None were for a CS class, though.

                    Today's lesson is brought to you by the word "niggardly". Remember kids, don't attribute to racism what can be explained by Scandinavian language roots. -- Robert Royall

PIEBALDconsult wrote:

                      Gary R. Wheeler wrote:

                      knuckle-dragging

                      Lab coat- and taped glasses-wearing you mean?

Gary R Wheeler wrote (#52):

                      Nope; those are the computer 'scientists'. The knuckle-draggers show up with a hangover, a sunburn from a weekend of bike riding, and a major piss-off because the stupid lab coat type didn't plug in a network cable.

                      Software Zen: delete this;
                      Fold With Us![^]

ktm TechMan wrote:

Just finished my B.Sc. in Computer Science, two years back. Had classes in C, assembly language, compiler design, Java, and even web programming. If one just sticks to web programming, there's no need for low-level knowledge of computers. The courses were great, but the teachers were lousy. I think that low-level knowledge is essential: when things go wrong where you least expect it, you need that low-level knowledge to find a workaround based on it.

Dean Moe wrote (#53):

                        ktm TechMan wrote:

I think that low-level knowledge is essential: when things go wrong where you least expect it, you need that low-level knowledge to find a workaround based on it.

That's my point - it's not necessary to know the nuts and bolts in detail, but you do need to understand the foundation. Also, I hate workarounds! If you are going to fix it, fix it right. Microsoft should have done that years and years ago, instead of building on a house of cards. :mad:

Ray Cassick wrote:

I think this is important also, but I think teaching these concepts as part of a language is also a good idea. Many of the concepts can be difficult to cover without concrete examples, and teaching the examples within some context helps most folks understand things a bit better. I went through one class that attempted to teach software concepts using only pseudocode... The problem was that students got so stuck on the syntax used in the fake code that oftentimes the 'programming concepts' themselves were lost.



_Damian S_ wrote (#54):

                          Yes, I should have specified a bit further - we did Pascal in first year and C in the other years... as far as commercial languages go, that's what we taught ourselves. ;-)

                          -------------------------------------------------------- Knowledge is knowing that the tomato is a fruit. Wisdom is not putting it in fruit salad!!

In reply to jinksk:

_Damian S_ wrote (#55):

As per my comment in the thread above, I should have been a little more specific. At uni we did Pascal and C. These aren't what I would call commercial languages in the business software field (the field I work in). We learned all the good things you need to know about designing good quality software, and implemented it all in C. Of course, these days I believe they use Java and some other stuff.

My point is that the language you learn at uni isn't all that important - it's the underlying concepts that they teach you. Someone who knows a particular language inside out but doesn't understand the basic concepts of good software design isn't as good (imho) as someone who can learn a new language reasonably quickly (or can find the answers they need) but has a great understanding of design/implementation concepts for making good quality software. No point writing a 5000-line unstructured mess full of gotos and multiple exits, with repeated sections of code all over the place, when a nicely written 1000-line piece of software will do nicely!!

                            -------------------------------------------------------- Knowledge is knowing that the tomato is a fruit. Wisdom is not putting it in fruit salad!!
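(A short C# sketch of _Damian S_'s mess-versus-structure point above; the pricing scenario is invented for illustration. Both methods compute the same total; the first is the style he warns against.)

    using System;
    using System.Collections.Generic;
    using System.Linq;

    static class Pricing
    {
        // The style warned against: a goto loop, duplicated checks,
        // and silent early exits that hide failures.
        static decimal TotalMessy(List<decimal> prices)
        {
            if (prices == null) return 0m;
            if (prices.Count == 0) return 0m;
            decimal total = 0m;
            int i = 0;
        Loop:
            if (prices[i] < 0m) return 0m; // silent failure, easy to miss
            total += prices[i];
            i++;
            if (i < prices.Count) goto Loop;
            return total;
        }

        // The same job stated once, with intent up front and one exit.
        static decimal Total(IReadOnlyCollection<decimal> prices)
        {
            if (prices == null) throw new ArgumentNullException(nameof(prices));
            if (prices.Any(p => p < 0m)) throw new ArgumentException("negative price");
            return prices.Sum();
        }

        static void Main()
        {
            var basket = new List<decimal> { 1.50m, 2.25m };
            Console.WriteLine(Total(basket));      // 3.75
            Console.WriteLine(TotalMessy(basket)); // 3.75, but read it twice
        }
    }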

In reply to Dean Moe:

sketch2002 wrote (#56):

From the replies, it looks like I'm the youngest to reply, and my experiences are quite different. My love of programming started between the third and fourth grades, when I took a summer school class that had us using BASIC on Apple //e computers (which I took the next two summers as well, because I enjoyed it so much). In high school we had some basics that I mostly sat through with glazed-over eyes (although this is when I learned of QuickBasic and began to lose my tan :-) ), and then eventually a VB class (which was a joke, because I taught more than I learned - the teacher as much as the other students). I went on to get an AAS in Computer Programming Technology and tested out of the basics classes. They had us use C++ and Java, as well as Cobol and something else (sorry, getting fuzzy here, maybe a scripting language or something) on an AS/400 system. I think the one I can't remember was the one I enjoyed the most, but it may have been the Cobol.

At any rate, back towards the intent of the topic: we hardly learn anything about the basics, and if we're taught, we're quite likely to ignore it and forget it as soon as we know it's not going to be on the next test. I wish I understood some of the more basic stuff, but as others have mentioned, it just isn't required. The languages now are too forgiving, and computers are so fast and storage is so cheap that there is no reason to worry about your code being optimized for either speed or size. I always wanted to learn assembler and binary and all of that (somehow I have always managed to prefer a keyboard to a mouse and a DOS box to a GUI), but you could spend several lifetimes learning everything you could want to learn, and ultimately you only need as much as your job requires; in my case that's VBA, VB.NET, SQL, ASP (classic, still), PHP, and HTML. And I was never formally taught any one of those; VBA comes the closest, but all of the rest was inferred from my original BASIC or learned via the internet. It's a shame that I don't use any of the Java, C++, or Cobol that I actually have records to show I know.

In reply to Gary R Wheeler:

PIEBALDconsult wrote (#57):

                                System Administrators you mean? They're not engineers.

In reply to _Damian S_:

James Lonero wrote (#58):

Damian, at one time C was the main commercial production development language (the 80s and early 90s). Even IBM used Pascal to write their MVS OS. Back then, a great C programmer could write tighter code than a C++ compiler could generate. (There was once a time when assembly was king.) Personally, I like Java, C#, and C++ much better than C or Pascal; they are more efficient and less bulky (in the code I write). It is nice to know the architecture of the hardware you are targeting, but the OS masks much of that from you. Knowing the limits of your language environment, and where your libraries can take you, is more important.

In reply to Dean Moe:

dybs wrote (#59):

I got my B.S. in C.S. in 12/07. Most of my courses were programming in C++ and ignored the hardware, except for the mandatory Computer Architecture course and the elective Microprocessors course. They were essentially the same course, except that in the first we learned x86 assembler and in the second Motorola 68k. They were actually two of my favorite (and most challenging) courses. We learned the basics of how a processor works (registers, pipelines, etc.) and wrote some relatively simple assembly programs, but that was it.

I now work at a small custom industrial electronics engineering company. I do all my programming in C++/CLI and C# now, but most of my job involves writing utility programs to interface with the hardware the electrical engineers design. I work with the embedded programmers to learn what command format to send (typically over a COM port as raw hex data). So I don't worry too much about how the low-level stuff works, but for me it's useful so I can tell whether any bugs are due to hardware, embedded software, or my utility software.

Dybs
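(A minimal C# sketch of the kind of utility dybs describes: push a raw hex command out a COM port and read back the reply. The port name, baud rate, and command bytes are all invented for illustration.)

    using System;
    using System.IO.Ports;

    class HardwareProbe
    {
        static void Main()
        {
            using (var port = new SerialPort("COM3", 9600, Parity.None, 8, StopBits.One))
            {
                port.ReadTimeout = 1000; // ms; don't hang on a dead board
                port.Open();

                // Hypothetical command frame: [start byte][opcode][checksum].
                byte[] cmd = { 0x02, 0x41, 0x43 };
                port.Write(cmd, 0, cmd.Length);

                byte[] reply = new byte[16];
                int n = port.Read(reply, 0, reply.Length);
                Console.WriteLine(BitConverter.ToString(reply, 0, n));
            }
        }
    }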
