Higher Software Education
-
This is my first post, so be gentle. My background is mostly on the hardware side of computers. Back in the early 80's, when motherboards were expensive ($1.5k - $30k, and not Apple computers!), I made a living repairing them. Part of my job involved writing assembly programs used to diagnose individual components and locate the one with the problem. (Try finding a memory chip with one blown bit on an array of four boards, or a stuck-on bit in the logic circuitry.) So I understand how computers work: what exactly happens when a handle (or pointer) is created, what a BLT is and how it works differently with the CPU than other programming operations, and so on. Basically, the nuts and bolts. Since I took up VB.Net and started trying to wrap my mind around all the concepts that make up OOP (polymorphism, delegates, reflection, etc.), I've noticed that a lot of the fundamentals of how a computer really works are never talked about. For instance, when a beginning programmer asks, "Why does BackColor = Color.Transparent only show a black screen?", the typical response is silence, "Microsoft doesn't support it," or "it doesn't work." I know why; do you? Just as fundamentals in baseball are necessary to win the World Series, I would think they'd be necessary in programming. My question is: what do they teach in school about computer fundamentals? Have you guys that have been programmers for years ever thought about it?
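(For the record, here's the answer as I understand it: WinForms transparency is simulated. A control whose BackColor is Color.Transparent asks its parent to paint the region behind it first, then draws itself on top; a top-level form has no parent control to borrow pixels from, so the "transparent" area just gets a default fill, and the beginner sees black. Below is a minimal sketch of the mechanism, assuming .NET Framework WinForms; the SeeThroughBox class is my own illustration, not production code.)

Imports System.Drawing
Imports System.Windows.Forms

Public Class SeeThroughBox
    Inherits Control

    Public Sub New()
        ' A plain Control rejects Color.Transparent outright (the
        ' BackColor setter throws an ArgumentException) until the
        ' control opts in to simulated transparency:
        SetStyle(ControlStyles.SupportsTransparentBackColor, True)

        ' "Transparent" is now legal, but it only means "let my parent
        ' paint my background first". There is no real alpha blending
        ' with whatever lies behind the window.
        BackColor = Color.Transparent
    End Sub
End Class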
-
My guess is that computer science majors don't get a lot of the hardware and low-level fundamentals. They leave that for the knuckle-dragging computer engineers. (Wright State University, B.S. in Computer Engineering, class of '84; Go Raiders!)
Software Zen:
delete this;
Fold With Us![^]
-
They taught it when I was in college almost twenty years ago, though certainly not to the extent you describe. But today, kids think they can read a book and actually learn programming in 21 days. :sigh: Maybe some recent graduates from decent schools will respond. Joel has written about this[^] sort of thing too.
-
When I did my degree (back in the early 90's) they taught programming concepts and practices rather than any particular language. Languages we taught ourselves, and applied the concepts we had learned.
-------------------------------------------------------- Knowledge is knowing that the tomato is a fruit. Wisdom is not putting it in fruit salad!!
-
BS mid 80's, MS late 90's:
* Compiler writing, YACC and Bison
* 360 assembler
* C, C++, FORTRAN, COBOL
* Boolean logic, FPGA programming
* TCP/IP and similar protocols
Good background for all but my last position. What I wish we had:
* More statistics: SAS, R, and SUDAAN
Everyone coming out of college and employed where I work is kicking my butt in SAS. Trying to learn it as fast as possible.
MrPlankton
“If I had my choice I would kill every reporter in the world but I am sure we would be getting reports from hell before breakfast.” William Tecumseh Sherman
-
Gary R. Wheeler wrote:
They leave that for the knuckle-dragging computer engineers.
Should I bow down to your Majesty, or do you want to get your head out of the clouds? :laugh: If you think about it, you can't exist without us. Digital is not limited to higher software, e.g. the BIOS in your computer, the digital controls in your car, or, better yet, digital TV! Dean P.S. Where do you place the guys that are the technicians? They're the ones who fix the engineers' screw-ups. ;P (What do you call the guys that fix your screw-up?) :doh:
modified on Tuesday, February 10, 2009 8:30 PM
-
I have a background similar to yours, including the electronics experience, and my first language was assembler. In my opinion it's a complete waste of time to teach fundamentals at the level you're talking about to developers. It's like teaching car mechanics how to smelt iron. Here is why I believe this: the field of computing has been nothing if not a non-stop progression of greater and greater abstraction, and the higher the level of abstraction, the more productive and useful the technology. The sheer quantity of knowledge required to effectively use computing technology has increased at every level of abstraction. Back in the bit-fiddling days there wasn't a tiny fraction of the info you need to know now to use something like C# to build a complete database application. Back in the day you could very easily be an expert on hardware at the resistor and IC level and on programming in assembly language. Nowadays that's just not practical or desirable.
"It's so simple to be wise. Just think of something stupid to say and then don't say it." -Sam Levenson
-
Gandalf7 wrote:
What do you call the guys that fix your screw-up?
Developers (I typically have to fix my own screw-ups.)
WE ARE DYSLEXIC OF BORG. Refutance is systile. Your a$$ will be laminated.
-
Gandalf7 wrote:
what a BLT is and how it works differently with the CPU than other
Mmmmmmmm. Bacon.
-
Gary R. Wheeler wrote:
knuckle-dragging
Lab coat- and taped glasses-wearing you mean?
-
John C wrote:
In my opinion it's a complete waste of time to teach fundamentals at the level you're talking about to developers.
I agree. There's just too much information involved... Like the old "Trying to drink from a firehose" adage. I think it would be the rare person that could totally understand every aspect of computers/languages/development, etc. today. Personally, I think it would be physically impossible for one person to get their head around everything and totally understand it. IMHO :sigh:
WE ARE DYSLEXIC OF BORG. Refutance is systile. Your a$$ will be laminated.
-
I'd agree with John C. My background is similar to yours in some ways: an electronics engineer who really knows the hardware, and who has done a lot of programming in low-level languages to directly manipulate the ones and zeroes (including digital and analog test systems). I think that knowledge is extremely valuable to a developer in many circumstances. But for general application development it probably isn't all that useful, and the great majority of developers will never need it. In fact, of the programmers I've worked with, few would be capable of learning it. They lack the interest and technical background to understand it, and it requires both.

Why should they? How many software developers really need to know about RAS/CAS refresh cycles, RS-232 handshaking transition times and levels, TTL vs. CMOS rise times, and such? They aren't taught it in school, and don't need it at work. Software developers' gifts and interests lie in other areas for the most part. Very few ever need to get into the nuts and bolts like you do, and if they do (and they have the background to understand it) there are classes available that many employers will pay for. Some people love software and should pursue that; others love hardware and should study that. The courses offered by schools reflect that boundary, and it's possible to take a double major if a student is interested in both. I've known a few who did.

By the way, I think you'll get a better grasp of modern programming concepts if you skip VB and focus on C++ or C#. OOP for VB was sort of an afterthought, and it shows. I'm not good at any of them, but I'm a hardware weenie. Back when I did a lot of programming the languages and concepts were simpler, and operating systems were easy to comprehend. I'm working at learning C# these days, but I'll never be more than a dabbler. It's fun, but it's not my career, and the hours available to master it are few.

If you're serious about learning programming with modern high-level languages, you're going to have to set that goal with a high priority and devote a lot of time to it. Be prepared to deal with huge frustration, not from the languages (they're not hard) but from the OS environment. It's a new world out there, a very busy and complex one. On the bright side, though, there's an immense sense of satisfaction when something you've been fighting to make work for weeks finally clicks. Have fun! :-D
"A Journey of a Thousand Rest Stops Begins with a Single Movement"
-
Gandalf7 wrote:
what a BLT is and how it works differently with the CPU than other
Mmmmmmmm. Bacon.
Oh, lettuce alone!:mad:
"A Journey of a Thousand Rest Stops Begins with a Single Movement"
-
Gandalf7 wrote:
What do they teach in school about computer fundamentals?
In high school they teach some of the basics, but not at any depth: a quick summary of memory, registers, etc., but certainly no assembler. (This is from my experience teaching at three high schools, and looking at the curriculum requirements, though individual schools do have their own courses.) At university it tends to depend upon which course you are taking; certainly software developers rarely look 'below' C as a language. As has been said elsewhere, the level of abstraction increases over time. I confess I learned machine code by typing it in byte by byte (indeed, my VERY first program counted from 1 to 255 and was entered a byte at a time by physically setting eight switches, then pressing a big red button to enter each instruction), and I believe it gives me a good understanding of what's going on behind the smoke and mirrors, and it can help me still.
Gandalf7 wrote:
Have you guys that have been programmers for years ever thought about it?
Yep - think about it all the time. I don't believe a depth of understanding is any longer necessary, although a certain level of understanding of the concepts can help. You don't need to know whether something is on the heap or the stack, for example, but it can be useful to understand the concepts of stack and heap when looking at parameter passing by reference or by value.
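To put a concrete face on that point, here's a tiny sketch (VB.NET; all the names are mine, purely for illustration) of what the stack/heap picture predicts: ByVal copies a structure, ByRef hands the callee the caller's variable, and for a class even ByVal shares the object, because only the reference is copied.

Imports System

Module PassingDemo
    Structure Point2D              ' value type: copied when passed ByVal
        Public X As Integer
        Public Y As Integer
    End Structure

    Class Tag                      ' reference type: the variable holds a reference
        Public Name As String
    End Class

    Sub NudgeByVal(ByVal p As Point2D)
        p.X += 10                  ' mutates a private copy; caller unaffected
    End Sub

    Sub NudgeByRef(ByRef p As Point2D)
        p.X += 10                  ' mutates the caller's own variable
    End Sub

    Sub Rename(ByVal t As Tag)
        ' ByVal copied the reference, not the object, so the caller
        ' sees this change anyway.
        t.Name = "changed"
    End Sub

    Sub Main()
        Dim p As New Point2D With {.X = 1, .Y = 1}
        NudgeByVal(p)
        Console.WriteLine(p.X)     ' 1  - only the copy changed
        NudgeByRef(p)
        Console.WriteLine(p.X)     ' 11 - the variable itself was passed
        Dim t As New Tag With {.Name = "original"}
        Rename(t)
        Console.WriteLine(t.Name)  ' "changed" - the object is shared
    End Sub
End Module

Nothing here requires knowing actual memory addresses, but the stack-copy vs. shared-heap-object model makes every one of those outputs predictable.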
___________________________________________ .\\axxx (That's an 'M')
-
Maxxx_ wrote:
pressing a big red button to enter that instruction
You had a red button???? :mad: Damn, all they gave me was a spring-loaded toggle switch. Well, two actually. One 'LOAD', the other 'NEXT'. It just isn't fair... :(
"A Journey of a Thousand Rest Stops Begins with a Single Movement"
-
I bet he had one of them fancy 8 pixel displays too.
-
Some people had it way too easy. All I had was red LEDs, 16 of them, until I designed an I/O card to talk to an ASR33 teletype. I bet he never had a bit-bucket like mine! :-D
"A Journey of a Thousand Rest Stops Begins with a Single Movement"
-
16!!!!!! I only had eight! And when I say big red button, it wasn't that big. Or that red, really. But it was a button. We had to get up at six o'clock in t'morning and lick t'road clean with t'tongue.
___________________________________________ .\\axxx (That's an 'M')