Programming in the 60s vs today...
-
Some of my coworkers are in their 60s and can debug any problem like it's nobody's business, because they learned low-level skills that have followed them throughout their entire careers. They have an understanding of the inner workings that the n00bs can only dream of. These days there are too many people in this field who'd have to call their IT support department because you disconnected their keyboard while they were at lunch. The framework, library, or language of the day they were experts at 3 years ago is useless today, and their skillset simply can't be adapted to new environments or situations. Those who are worth keeping around in the long term are few and far between--that's why there are so many job-hoppers.
Debugging and testing are the most valuable skills, and they're seldom taught.
-
To digress a little... time was, an intelligent and educated person could know just about everything there was to know. Literally. And from that grew the stereotype of the lone scientist in his lab coming up with some new invention to change the world... for a while such people could exist, but not any longer. No one can know everything, not even within one subject area; the most anyone can be is a master of one or two (or more, maybe) disciplines within a subject, there is that much knowledge now. So science now, and in the future, is and will be a collaborative affair. The big advances, take nuclear fusion (if it ever happens), quantum computing, or a myriad of medical advances, aren't and won't be made by our stereotypical white-coated lone scientist in a lab, but by the collaborative efforts of different research groups around the world. We all have to stand on the giant collective shoulders of those around us in order to see anything.
Makes me think of James Burke's Connections series. The path to any discovery is usually weird, and builds on what came before.
-
You had to, didn't you? He's gonna choose JavaScript.
Wrong is evil and must be defeated. - Jeff Ello
-
Not that I was alive in the 60s, but when it came to learning technology in the olden days it was more like this... you learn X, Y, and Z. Master them. You're a programmer. These days it's more like learn A, B, C, D, E, F, G, H, I, J, K, L, M, N, O, P, Q, R, S, T, U, V, W, X, and Y. You have to know them all. You're *supposed* to master them all. And you can use all of them for decades, but as soon as you don't know Z... you're a n00b! How dare you not know something. We want someone who's used Z forget A through Y... Z baby all the way! What... you want to spend time with family these days? Freak! Go home and study until you die... get that Z too. Although as soon as you do we're switching to AA. Experienced people know that to master everything these days is impossible. But gee golly that Z is so shiny. Who cares if it's a 90% copy of Y... Z is so shiny. Welcome to the future. :~
Jeremy Falcon
Yup. I believe the latest buzzword today is "full stack" developer. Sorry, I don't buy that designation AT ALL. You could re-brand it "jack of all trades, master of none". I'm sorry, but the designation is pure B.S. I've been developing code for 40 years; I think I've developed some good proficiency in that time and know a few good technologies to use in my development. My code gets answers and it runs FAST. (I've had more than one employer ask me in an incredulous tone, "Why does your stuff run so fast?") Er, maybe it's because I don't haul in a couple of gigs of library code to run my executables... I don't even apply for positions that are looking for "full stack" developers because, IMHO, they are completely deluded as to what software development is really about. I believe that would be... solving problems? Full stack... seriously?
If you think hiring a professional is expensive, wait until you hire an amateur! - Red Adair
-
Jeremy Falcon wrote:
Not that I was alive in the 60s, but when it came to learning technology in the olden days it was more like this... you learn X, Y, and Z. Master them. You're a programmer.
Back in the day, the progression was:
Jr. Programmer
Programmer
Programmer/Analyst
Programmer/Analyst II
Sr. Programmer/Analyst
Sr. Analyst (or Business Analyst)
I always preferred adding Analyst. First you learn the syntax and the environment. As a Jr. Programmer, you often took someone's scribbles of code on punch cards and punched them. The person reviewed them. One programmer could keep a few Jr. Programmers busy (things changed). Usually it was teams of both... After the language/syntax and environment were learned, you moved up. The real interplay is in taking business needs and getting to computer solutions.
=
My favorite job interview was where I was competing against someone with 5 years of Clipper for a Clipper job. I had SEEN Clipper code, and had done a little dBase code. But I had great analytical skills. The guy interviewing me for a part-time position was convinced he would hire the "Pro", and not me, but already had my interview scheduled. I simply explained that it is the analysis where all the failures begin. The syntax of the language is easy enough to learn, if you are solving the right problem. I asked him to think about the "fixes" he had to have the previous guy make. What percentage were:
- Did not understand the goal properly
- Logic error (did not express the goal properly)
- Lack of testing
- Lack of user sign-off
- User error/user confusion
- Bad syntax/failure to use the programming language correctly?
I explained to him that if he hired me, I would drive the first few items to ZERO occurrences, and that my biggest fear was programming myself out of a job, because the current guy was constantly fixing his own mistakes. He laughed. He thought... He hired... One year later, he apologized that he had run out of work for me to do, wrote me a 2-page letter of recommendation, and gave me a minimum number of hours each week to do whatever I wanted.
I want to hire creative problem solvers who know how to solve problems and express them in code. Then the importance of the language is reduced, and the rework is reduced. But nobody wants to help that person by giving them a little time to learn a technology they may need. That's crazy. Good problem solvers are hard to find. Great programmer/analysts are hard to find. So old companies would make them!
-
Jeremy Falcon wrote:
Not that I was alive in the 60s, but when it came to learning technology in the olden days it was more like this... you learn X, Y, and Z. Master them. You're a programmer.
Heck, even into the early '90s you could "get by" with just a few good skills. I think retraining hell is companies' revenge for having to pay us so well. I have a Despair Inc coffee mug that says "Just because you're necessary, doesn't mean you're important." That sums it up nicely.
-
Jeremy Falcon wrote:
Who cares if it's a 90% copy of Y... Z is so shiny
Unfortunately the reality is that no one can figure out if new idioms are worthwhile until a lot of people use them. In the '60s there were no options. Not to mention that programmers had to wear suits. Not to mention that you can still get a job programming COBOL if you want to.
-
Jeremy Falcon wrote:
you learn X, Y, and Z.
I wasn't there either, but if I understand correctly, you didn't learn all three. You picked your career path and then learned COBOL or FORTRAN or assembly. Or you learned Pascal and BASIC and hoped to get a job teaching.
Or Algol. There's a good chance if you were a programmer in the 60s you'd be exposed to Algol.
-
Jeremy Falcon wrote:
Not that I was alive in the 60s, but when it came to learning technology in the olden days it was more like this... you learn X, Y, and Z. Master them. You're a programmer.
I agree with you that there is a lot more to know now than in the 60s. More importantly, I think, things change a lot faster now. But to be fair, there's a lot they had to know back then that most of us don't have to think about at all anymore. In particular, we don't usually need to think nearly as carefully about hardware issues (memory constraints, timing issues) or lower-level software issues (how to write a quicksort algorithm or a garbage collector). We don't need to cram 8 different boolean values into a single byte and then mask and shift to read each value back out. We don't need to write code that modifies itself or overlays itself to save memory. And we don't have to wait fifteen minutes or more for an edit/compile/run cycle.
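For anyone who never had to do it, the bit-packing trick mentioned above can be sketched in a few lines. This is just a minimal illustration in Python (obviously not a language of the era), and the function names are my own, not from any post in this thread: each flag occupies one bit, set with OR, and recovered with a shift plus an AND mask.

```python
def pack_flags(flags):
    """Pack up to 8 booleans (index 0 = least significant bit) into one byte-sized int."""
    byte = 0
    for i, flag in enumerate(flags[:8]):
        if flag:
            byte |= 1 << i  # set bit i
    return byte

def read_flag(byte, i):
    """Recover flag i by shifting it down and masking off everything else."""
    return bool((byte >> i) & 1)
```

So `pack_flags([True, False, True])` produces `0b101`, and `read_flag(0b101, 2)` recovers `True`. One byte instead of eight, which mattered a great deal when total memory was measured in kilobytes.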
-
dandy72 wrote:
These days there's too many people in this field who'd have to resort to calling their IT support department because you disconnected their keyboard while they were away at lunch time.
With such a vast array of business technology needs, people specialize. Long ago the person who built a log cabin could also dig the outhouse latrine, but today I do not expect the cable guy to fix my toilet.
-
With such a vast array of business technology needs, people specialize. Long ago the person who built a log cabin could also dig the outhouse latrine, but today I do not expect the cable guy to fix my toilet.
While I agree with your assertion in the general sense, are you saying it's ok for people to never try to do anything, ever, that deviates from the only script they've learned to follow? If that's the case, then the automation revolution can't get here fast enough, because clearly nothing of value will be lost.
-
I hear you. Access is one of the best RAD tools around. Nothing beats it for one-off projects and I use it as a friendlier UI for SQL Server than SSMS. E.g., it is a breeze to link databases from different servers compared to the contorted SSMS procedure. Bud Staniek
-
I do half my work in Fortran and half in C#. So, on average, it feels like the '80s to me. :laugh: I feel blessed to be avoiding the new stuff. I do suffer vicariously through all of you CPers though.
Wait. I'm not the only person using Fortran on the forums? BLASPHEMER!
-
Or Algol. There's a good chance if you were a programmer in the 60s you'd be exposed to Algol.
You say that like it is some form of harmful radiation. I like. ;)
-
Way back when, I was a Professor of Computer Science (mid-eighties) and I thought I might know as much as 85% of what there was to know about computers and software - and I was upset about not knowing the other 15%. Nowadays I think I know about 0.0085% of what there is and falling behind about 0.001% per week - and am happy not knowing all the rest!
- I would love to change the world, but they won’t give me the source code.
Wise choice, bro. I might have gone insane if I'd forced myself to learn all these: AngularJs, BackboneJs, EmberJs, Web Toolkit, jQuery, MooTools, React, OpenUI5, Smart Client, UnifiedJs, VueJs, and Webix.