Higher Software Education
-
This is my first post, so be gentle. My background is mostly on the hardware side of computers. Back in the day when motherboards were expensive ($1.5k - $30k) in the early '80s (not Apple computers!), I made a living repairing them. Part of my job involved writing assembly programs used to diagnose individual components and locate the one having a problem. (Try finding a memory chip with one blown bit on an array of four boards, or a stuck-on bit in the logic circuitry.) So I understand how computers work: what exactly happens when a handle (or pointer) is created, what a BLT is and how it works differently with the CPU than other programming operations, etc. Basically, the nuts and bolts. I've noticed, since I took up VB.Net and began trying to wrap my mind around all the concepts that make up OOP (polymorphism, delegates, reflection, etc.), that a lot of the fundamentals of how a computer really works are never talked about. For instance, when a beginning programmer asks, "BackColor = Color.Transparent only shows a black screen. Why?", the typical response is silence, "Microsoft doesn't support it," or "it doesn't work." I know why. Do you? Just as fundamentals in baseball are necessary to win the World Series, I would think they would be necessary in programming. My question is: what do they teach in school about computer fundamentals? Have you guys that have been programmers for years ever thought about it?
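For the curious, a minimal WinForms sketch of the usual answer (the control name is invented for illustration): Color.Transparent in WinForms is simulated rather than real alpha compositing. A control that opts in paints its parent's background first, so a top-level form, which has no parent to paint, falls back to black.

using System.Drawing;
using System.Windows.Forms;

// SeeThroughPanel is a made-up name; SetStyle, ControlStyles, and
// TransparencyKey are the real WinForms APIs involved.
public class SeeThroughPanel : Control
{
    public SeeThroughPanel()
    {
        // Opt in to simulated transparency: paint the parent's
        // background first, then this control's own content.
        SetStyle(ControlStyles.SupportsTransparentBackColor, true);
        BackColor = Color.Transparent;   // shows the parent, not the desktop
    }
}

// On a top-level Form there is no parent to paint, so Color.Transparent
// renders as black. True see-through needs a layered window instead:
//   this.BackColor = Color.Magenta;
//   this.TransparencyKey = Color.Magenta;   // magenta pixels become holes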
A lot of computer programmers are self-trained; they vary in skill from your basic tyre fitter to an aeronautical engineer, to use an analogy. Most computer science courses have some assembler, C, algorithms, and data structures modules. Without this knowledge I don't know how anyone can understand what their programs are really doing. Higher levels of abstraction can be useful; the physical computer architecture or virtual machine design seems a reasonable cut-off point for most people. Most programmers certainly don't need to know PNP doping levels, etc. I learnt some of this in computer science classes, some in physics, electronics, or maths; other stuff is self-taught. I think it's important to know this stuff if you work with computers.

Many people I work with are the 'tyre fitters': they basically put square pegs in square holes. They can use an IDE and a database, but they could not tell you how either was written, and they could not write a compiler, assembler, or virtual machine (a minimal sketch of the last is below). Most people do not realise this distinction; they think they are high-flying programmers, but really they are the factory workers of the 20th century, and it's no surprise their jobs get offshored.

I have been programming 17 years and have thought about this for years. I learnt 68000 and 8086 assembler back then, and I have to agree that electronics and possibly even physics and maths majors probably understand the fundamentals better than most programmers; I have worked with such people on embedded projects. Now I code entirely in high-level languages: C++, Java, C#. Java and C# are very easy languages to learn; anyone who can manage to write anything in assembler should not struggle. One thing that has changed is the application domains: there is a lot of framework, library, architecture, pattern, OS, and other tooling that, while less fundamental, is often required to produce a modern application in an acceptable timeframe.
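To make "write a virtual machine" concrete, here is a minimal sketch of a toy stack machine in C#. The opcodes and names are invented for illustration; no real VM's instruction set is implied.

using System;
using System.Collections.Generic;

// A toy stack machine: push operands, pop-compute-push for arithmetic.
enum Op { Push, Add, Mul, Print, Halt }

static class TinyVm
{
    static void Run((Op op, int arg)[] program)
    {
        var stack = new Stack<int>();
        for (int pc = 0; pc < program.Length; pc++)
        {
            var (op, arg) = program[pc];
            switch (op)
            {
                case Op.Push:  stack.Push(arg); break;
                case Op.Add:   stack.Push(stack.Pop() + stack.Pop()); break;
                case Op.Mul:   stack.Push(stack.Pop() * stack.Pop()); break;
                case Op.Print: Console.WriteLine(stack.Peek()); break;
                case Op.Halt:  return;
            }
        }
    }

    static void Main()
    {
        // Computes (2 + 3) * 4 and prints 20.
        Run(new[] {
            (Op.Push, 2), (Op.Push, 3), (Op.Add, 0),
            (Op.Push, 4), (Op.Mul, 0), (Op.Print, 0), (Op.Halt, 0)
        });
    }
}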
-
16!!!!!! I only had eight! And, when I say big red button, it wasn't that big. Or that red, really. But it was a button. We had to get up at six o'clock in t'morning and lick t'road clean with t'tongue.
___________________________________________ .\\axxx (That's an 'M')
Maxxx_ wrote:
16!!!!!! I only had eight! And, when I say big red button, it wasn't that big. Or that red, really. But it was a button.
We had it nice: 16 LEDs, plus one for overflow and one for run/halt. And our big red button was a small gray push-button; the big red one was E-Stop. What was sweet was that the machine booted in one clock cycle, the time it took to sense that the gray button was pushed. I agree that C and C++ are the path to take for a hardware geek; you can view the assembly code to get those warm fuzzies again. The programmers who work for me today have zero hardware experience. I have found over time that some developers (bad ones) have no concept of how a computer even works; they just know how to translate some English into another language. They also cannot optimize code (back in the day, I had to know that i++ and ++i took different amounts of memory so we could get the code to fit into RAM), nor understand why building loops in different ways can affect the speed of the application.
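To illustrate the semantic side of that, a small C# sketch; note that as bare statements, today's compilers emit identical code for both forms, so the size difference described above is from older compilers on memory-tight targets.

using System;

class IncrementDemo
{
    static void Main()
    {
        int i = 5;
        int a = i++;                     // post-increment: a gets the OLD value
        Console.WriteLine($"{a} {i}");   // prints "5 6"

        int j = 5;
        int b = ++j;                     // pre-increment: b gets the NEW value
        Console.WriteLine($"{b} {j}");   // prints "6 6"
    }
}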
Gary
-
I started programming before I got into college, so I didn't understand the nuts and bolts of what I was doing either. Programming books didn't teach that, but I kept going. Only recently have I come to understand the nuts and bolts, because I'm taking a Computer Engineering course in college. I'm really happy to be learning these things, and I'm glad I chose Computer Engineering instead of Computer Science. I agree with you: knowing the fundamentals is necessary in programming, at least if you are going to be a successful programmer. Fábio
-
A lot of it comes from the "turn the key and go" mentality. <analogy> Driver education rarely teaches kids how a car works, just how to drive it. Then we end up with "It's making a funny noise from the thingy up front." I remember a four-week Driver & Maintenance course I took in the Army. We had to diagnose everything and repair almost everything in the field. Imagine sticking your hand into the throttle linkage while straddling a running engine to push the fuel cutoff (a student had pulled too hard on the cable and broken it). </analogy>
Cheetah. Ferret. Gonads. What more can I say? - Pete O'Hanlon
-
When I did my degree (back in the early '90s) they taught programming concepts and practices rather than any particular language. Languages we taught ourselves, and we applied the concepts we had learned.
-------------------------------------------------------- Knowledge is knowing that the tomato is a fruit. Wisdom is not putting it in fruit salad!!
I think this is important too, but teaching these concepts as part of a language is also a good idea. Many of the concepts are difficult to cover without concrete examples, and teaching the examples within some context helps most folks understand things a bit better. I went through one class that attempted to teach software concepts using only pseudocode. The problem was that everyone got so stuck on the syntax of the fake code that, often, the programming concepts themselves were lost.
-
I agree. There's just too much information involved... like the old "trying to drink from a fire hose" adage. It would be a rare person who could totally understand every aspect of computers/languages/development, etc. today; personally, I think it's impossible for one person to get their head around all of it. IMHO :sigh:
WE ARE DYSLEXIC OF BORG. Refutance is systile. Your a$$ will be laminated.
-
Member 2825662 wrote:
Worked on 15 bit word computers
Would they have been ICL jobs? They are the only 15 bit word computers I worked on, or indeed have knowledge of.
Henry Minute Do not read medical books! You could die of a misprint. - Mark Twain Girl: (staring) "Why do you need an icy cucumber?" “I want to report a fraud. The government is lying to us all.”
They were made by Univac, converted from some commercial mainframe they had at the time. They were 15 data bits plus one parity bit. We had 64 Kwords of magnetic core memory in 4 blocks. No keyboard or display, although we connected it to an IBM Selectric typewriter that was modified with interface electronics. At least we had magnetic tape, not paper tape like some of the other computers on the boat. :cool:
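For anyone who hasn't met parity memory, a minimal C# sketch of the idea (illustrative only): the extra bit is chosen so the count of 1s in the word is even, and a single flipped bit then reads back as an odd count.

using System;

static class CoreMemory
{
    // Even parity for a 15-bit word: the stored 16th bit makes the
    // total number of 1s even; one blown bit shows up as an odd count.
    static int ParityBit(int word15)
    {
        int ones = 0;
        for (int i = 0; i < 15; i++)
            ones += (word15 >> i) & 1;
        return ones % 2;
    }

    static void Main()
    {
        int word = 0b101_0110_1100_1010;    // 15 data bits
        Console.WriteLine(ParityBit(word)); // 0 here: eight 1s, already even
    }
}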
-
I have a Computer Science degree from a liberal arts college, so I have some hardware background (though not nearly as much as I would have gotten from an engineering program). I have a decent understanding of how computers work at the physical level, and that has helped me more times than I can count when I'm up against a difficult problem, especially when it has come to optimizing code. Nowadays, most of the colleges in my area have gone to what they call a "Management Information Science" degree: basically, a combination of programming and business degrees. For the most part, students in these programs seem to learn how to throw together code, but not why you do what you do. That turns out a lot of clueless people when it comes to actually doing the work. I agree with others who have said that the abstraction of modern languages makes some of that in-depth hardware knowledge unnecessary. However, you do need at least a basic knowledge of how the machines work to do this job effectively. Ed
-
It's really a personality thing. The colleagues I consider "the very best" all have very strong analytical and problem-solving personalities, and an unbounded amount of intellectual curiosity. Their educational backgrounds were not necessarily in computer science either (although some eventually received their master's degrees in CS). All had undergraduate degrees, in (computer science majors not included, ranked by number of people): electrical engineering; physics and mathematics; other engineering disciplines (like mechanical). Like my sister and I always say: it's all about solving puzzles; you have to like solving puzzles. As for me, I like to fix things, find out how things work, and make things, which I suppose is why I love being a software engineer so much. Kind regards,
David
-
I've been out of university for a little less than three years. We didn't study a lot about hardware, but we did have an opportunity to build an 8-bit machine from logic gates. We touched on assembly briefly, but that was very limited. I'd say we got, at best, a brief introduction to low-level concepts: not enough to be any sort of expert, but enough to generate interest in them, if one wanted to pursue them further.
-
My impression is that a BS degree in Computer Science generally (depending on the university) includes:
1. Introductory programming (Pascal or C in the '80s, C++ in the '90s, now Java or C#)
2. Data structures (stacks, queues, lists, trees, graphs, hash tables, etc.)
3. Assembly language
4. Operating systems
5. "Hardware without electronics" (my term): design of registers, multiplexers, etc., up to the design of a simple computer using gates (AND, OR) as primitives. No clue about how gates are made, or transistors.
6. Additional courses depending on the university, degree program, and the student's selections, such as programming languages, finite automata, AI, graphics, file structures, databases, numerical methods, etc.
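A minimal C# sketch of what item 5 looks like in practice, with gates as the only primitives (the names are just for illustration): a one-bit full adder, rippled to add two bytes.

using System;

static class GateLab
{
    // Gate primitives: the only "hardware" we allow ourselves.
    static bool And(bool a, bool b) => a && b;
    static bool Or(bool a, bool b) => a || b;
    static bool Xor(bool a, bool b) => a != b;

    // One-bit full adder: two XORs, two ANDs, one OR.
    static (bool Sum, bool Carry) FullAdder(bool a, bool b, bool cin)
    {
        bool half = Xor(a, b);
        return (Xor(half, cin), Or(And(a, b), And(half, cin)));
    }

    static void Main()
    {
        // Ripple-carry: chain eight full adders to add two bytes.
        byte x = 200, y = 55;
        bool carry = false;
        int sum = 0;
        for (int i = 0; i < 8; i++)
        {
            bool bit;
            (bit, carry) = FullAdder(((x >> i) & 1) == 1, ((y >> i) & 1) == 1, carry);
            if (bit) sum |= 1 << i;
        }
        Console.WriteLine(sum);   // 255
    }
}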
-
I separate the programmers in my life and on teams I work on into two groups: those who know the fundamentals and those who don't. I find that the guys who don't (mainly younger programmers who graduated from college post-1997) have a very GUI-centered view of the universe. .NET in general has a GUI-centered view of the universe. That can be really useful if you're focused on end results and the GUI. It's not terribly useful if your application is not scaling well or you need to hand-optimize anything at all. The best team has both types of programmers on it; the project is split into tiers, and the lower-level a tier is, the more experience you need on it.
-
I'm five years into programming, just out of grad school (computational chemistry), and needed to take some CSCI courses. Here's what I think should be mentioned. Yes, for OOP, learning the nuts and bolts isn't fully necessary; at some point you may get a couple of "tricks" under your belt that exploit some of the underlying hardware structure, but "you don't need to learn why, just that if you do it in this order it works better" (as explained to me by a PhD in computer science). There is a hierarchy in who uses what language, and by all accounts here it is (in academia):

Assembly, Fortran, Java = Engineering
Assembly, Fortran, C, Java = Physics, Math
Assembly, Fortran, C, C++ = Chemistry
Fortran, C, C++, Java = Biology (Psychology)

(Java is usually for students, not for "real" projects, in my experience.) Nothing against Java; this ordering reflects the amount of control, perceived or otherwise, the user actually needs over the optimization of the application they are working on, and the average size of the program (in this list, it is my understanding that the programs get larger as they go down, more verbose and less sleek). However, the CSCI courses I attended don't make this distinction (for the most part) and suggest using the language you know if it will get the job done, which is the sentiment I perceive from many people/posts on here.

There are exceptions, but largely, even in physics and chemistry we are not so much worried about the hardware level of things. However, when it becomes necessary (we have a number of compute cycles that take weeks to complete for one data point), we take a look under the hood to see where we can tinker for some extra cycles. Typically this is more stylistic than architecture-based, because we run on a variety of workstations (there is a Cray, I think, in the math department!), so we are relegated to methods that work sufficiently well on a number of systems (one such portable tweak is sketched below). This doesn't mean that when we were looking at "forcing" a calculation we didn't look into hardware dependence and process timing; the return for the effort just wasn't there, and our time was much more wisely spent making yet another module/library that is "good enough" for most situations. Overall, I believe that if someone becomes "limited" by either what they know
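As one example of the style-level, portable kind of tweak described above, a C# sketch (the same idea applies in C or Fortran, noting Fortran is column-major): traversal order against a matrix's memory layout can cost several times the runtime with identical arithmetic.

using System;

static class LoopOrder
{
    // C# rectangular arrays are row-major, so iterating the last index
    // innermost touches consecutive addresses and stays cache-friendly
    // on essentially any workstation.
    static double SumRowMajor(double[,] m)
    {
        double sum = 0;
        for (int i = 0; i < m.GetLength(0); i++)
            for (int j = 0; j < m.GetLength(1); j++)
                sum += m[i, j];          // sequential memory access
        return sum;
    }

    // Swapped loops stride a full row apart on every step: same result,
    // but often several times slower on large matrices.
    static double SumColumnMajor(double[,] m)
    {
        double sum = 0;
        for (int j = 0; j < m.GetLength(1); j++)
            for (int i = 0; i < m.GetLength(0); i++)
                sum += m[i, j];
        return sum;
    }

    static void Main()
    {
        var m = new double[2000, 2000];
        m[0, 0] = 1;
        Console.WriteLine(SumRowMajor(m) == SumColumnMajor(m));   // True
    }
}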
-
I graduated in 2003 from a highly rated computer science program. I was exposed to many of the fundamentals; I've read Knuth and the whole bit. I have always wanted to believe that knowing the "nuts and bolts" makes you better, and I probably still do. However, in my experience so far (5+ years working), it has not helped me, and people from other backgrounds do just as well knowing only the latest C# web controls. It is an interesting question, though: is there another field where, knowing only the last five years of technology and nothing else, you might be better suited than someone who has been in the business for decades?
-
I just graduated in 2008 with a major in Computer and Software Systems. It's not a traditional Computer Science or Computer Engineering degree; it attempts to blend the two together, and I usually describe it as a Software Engineering degree. Prior to that, I received two Associate-level degrees, in Computer Science and in Engineering Technology. Ironically, my Engineering Technology degree required more computer courses than Computer Science did; the AS-CS degree required only one computer course, "Level I" C++ Programming. My BS-CSS program was much more involved, for obvious reasons. The only catch to my experience detailed below is that the university I attended was a small satellite campus (University of Washington-Bothell) and everything was on a small scale, including the classes. We didn't have lecture halls, for example, and a class was considered "large" at 20-25 students; most were about 15.

So, to answer your question, "what do they teach in school about computer fundamentals": not much, especially at higher levels. There's no good place to grasp the mid-level concepts you mentioned. In the intro courses, that's way too advanced (they usually teach to the lowest common denominator, so the final program in those courses is usually the 17th iteration of "Hello World"). In the higher courses, it's assumed you already know them, BUT if you don't, chances are one of your classmates does and will give you a crash course, or the instructor/professor/TA will assist you. The higher-level courses teach the approach to solving problems (e.g., algorithms), high-level theory (e.g., when calling SomeShape->Draw(), what does the video driver do [mostly ignoring hardware], and why is it better than some other way?), but most of all, general practices on HOW to approach software as a whole (e.g., Software Development Lifecycles [SDLCs]). For all the programming/language-specific details, that's what Google and resources such as MSDN (or CodeProject! :P) are for.

It seems counter-intuitive, but it makes sense to ignore language-specific inquiries if you think about it. For those here who graduated from their respective CS programs in the 80s and are still actively working in the industry: how many of you still use the same languages today that you did back in college? In my program, most of our work was done in C++, despite .NET's prevalence in this area due to Redmond being right around the corner. The argument given was that C++ is versatile enough that it
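For readers newer to this, a minimal C# sketch of the SomeShape->Draw() idea, ignoring the video driver entirely (the types are invented for illustration): the call site holds a base reference, and the runtime dispatches to the concrete override.

using System;

// The caller only knows about Shape; the runtime picks the override.
abstract class Shape
{
    public abstract void Draw();
}

class Circle : Shape
{
    public override void Draw() => Console.WriteLine("rasterize a circle");
}

class Square : Shape
{
    public override void Draw() => Console.WriteLine("rasterize a square");
}

static class Demo
{
    static void Main()
    {
        Shape[] scene = { new Circle(), new Square() };
        foreach (Shape s in scene)
            s.Draw();   // same call site, different code runs
    }
}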
-
I graduated in 2003 from a highly rated computer science program. I was exposed to many of the fundamentals; I've read Knuth and the whole bit. I have always wanted to believe that knowing the "nuts and bolts" makes you better, and I probably still do. However, in my experience so far (5+ years working), it has not helped me, and people from other backgrounds do just as well knowing only the latest C# web controls. It is an interesting question, though: is there another field where, knowing only the last five years of technology and nothing else, you might be better suited than someone who has been in the business for decades?
Yes, they can have a career making business apps on Windows. Would they be useful on projects like a computer game, a compiler, a graphics program, an iPhone app, an engine management system, a CAD program, a web framework, a database, a 3D engine, a distributed file system, a multithreaded web server, computational chemistry, medical imaging, etc.? They would be useless on a vast range of applications that don't involve Windows and drag-and-drop. Luckily for them, 80% of the jobs are vanilla Windows business apps.
-
I have to agree with Roger on learning VB.Net versus C# or some other OO language. VB.NET is what I focused on for years, because I originally learned the basics of VB 6 in college, so I stayed on that route via self-learning over the years. Now I'm trying to learn C#, because that's what everyone seems to use. I've watched Code Project articles shift from a high percentage of VB topics to, what seems to be, 90% C# articles and 2% VB. Whenever I go searching the web for a code example on a subject I need help with, nearly every example I find is in C#. So it looks like it's time to give up VB and go the other route, since, apparently, the rest of the world already has. I wish I hadn't put so many months of my time into learning VB and had instead focused on C#, or even C. Oh well. Time to start all over again.
-
Course 6.001, Structure and Interpretation of Computer Programs, was probably the course I learned the most from. Once you understand the hows and whys, programming in any particular language is mainly syntax. :cool:
-
16!!!!!! I only had eight! And, when I say big red button, it wasn't that big. Or that red, really. But it was a button. We had to get up at six o'clock in t'morning and lick t'road clean with t'tongue.
___________________________________________ .\\axxx (That's an 'M')
Softie. We had to draw the wires ourselves, and then solder them together. When we needed another program, we ripped those wires out and put in new ones. Whippersnappers!
-
_Damian S_ wrote:
When I did my degree (back in the early '90s) they taught programming concepts and practices rather than any particular language. Languages we taught ourselves, and we applied the concepts we had learned.
That's why most employers wonder why most new CS & SE graduates have never written an application with more than 1,000 lines of code. :doh:
Why go back to the drawing board when you have a Tablet PC?