Useless or just Obsolete?
-
I'm old. I know stuff. :)
".45 ACP - because shooting twice is just silly" - JSOP, 2010
-----
You can never have too much ammo - unless you're swimming, or on fire. - JSOP, 2010
-----
When you pry the gun from my cold dead hands, be careful - the barrel will be very hot. - JSOP, 2013
John Simmons / outlaw programmer wrote:
I'm old. I know stuff.
Snaffled to sig!
Never underestimate the power of human stupidity - RAH I'm old. I know stuff - JSOP
-
Forogar wrote:
Useless or just Obsolete?
I'd like to think I'm both! ;P I'm not sure about Masters or PhDs, but good old BS degrees in Computer Science are well worth the time, money and effort. Is it perfect? No. Does it weed out all the idiots? No. But before you throw the baby out with the bath water, think of all the times your idiot neighbor said "Little Johnny is really good with computers*, he wants to get a job writing computer games!" *Turns out he knows how to connect the XBox to the TV.
Mike Mullikin wrote:
*Turns out he knows how to connect the XBox to the TV.
A skill I highly doubt you have.
Michael Martin Australia "I controlled my laughter and simple said "No,I am very busy,so I can't write any code for you". The moment they heard this all the smiling face turned into a sad looking face and one of them farted. So I had to leave the place as soon as possible." - Mr.Prakash One Fine Saturday. 24/04/2004
-
Looking at a few other recent posts got me thinking about qualifications, degrees and such-like things. Many decades ago I got a PhD in Computer Science and, at the time, I thought it was a good thing. Now, when I look back at how useful it was to learn all that, I realise that nearly everything I learned is obsolete and about as useful as knowing how a carburettor works in these days of fuel-injected engines.

A large part of it was learning the history of computing: Charles Babbage and his Difference Engine, Blaise Pascal and Herman Hollerith with punched cards, punched tape and other punchy things. I even learned about Jacquard looms for early machine automation. Compare and contrast tape drives (high capacity, serial data access) and disc drives (lower capacity but random access) - notice the spelling of "disc drive". COBOL, FORTRAN, Pascal and other modern computer languages. All good stuff at the time but completely irrelevant these days.

Back in 1977, my thesis (I can't even remember the title) was based on distributed computing with small home computers, or remote terminals at least, all connected together via a universal network where one could write documents, do reports on things using a database or data files, send messages to other computer users, order on-line, even play games alone or with other networked players. I even wrote some games (in assembler and FORTRAN) to demonstrate how this could work. Hah! Like any of that would really happen! :wtf:

For a few years I was a professor, teaching all this stuff to poor souls who thought it was all new and exciting... then I got a proper job and the rest is history. You'll have to wait for my autobiography to hear about jet fighters :cool:, MI5 :suss:, chasing bandits in the mountains behind Hong Kong X|, and other boring, non-computer related stuff. Oh, the tales I could tell, once the Official Secrets Act period has expired! :~

My son will shortly complete his second Masters degree and all he does is complain about how much money he owes on his student loan.

Anyway, finally to the question... Do you think getting a degree these days is worth the time, effort and money, or should we consider going back to the tried and trusted apprentice system (basically interns starting with minimal but focused initial education)?
- I would love to change the world, but they won’t give me the source code.
Knowledge gets outdated; the syllabus should be updated with time. If people stopped getting degrees and doing research, America would not be in the unique position it holds in the world arena. Considering the way it's going, for profit only, screwing its own countrymen with job losses, student loans and debt... well, it's up to Trump to make America great again and save the world from another disaster.
Caveat Emptor. "Progress doesn't come from early risers – progress is made by lazy men looking for easier ways to do things." Lazarus Long
-
Mike Mullikin wrote:
*Turns out he knows how to connect the XBox to the TV.
A skill I highly doubt you have.
That's what grandkids are for...
-
Mycroft Holmes wrote:
That's what grandkids are for...
I'm not there yet, so I get the kids to do that stuff.
-
I think a common problem people have when considering college is the idea that college is there to teach you a skill. It's not. College teaches you to learn. You might acquire some entry-level skill set related to some career path, but, as you pointed out, as soon as the ink on the diploma is dry the skills you learned are out of date. Successful people come out of college with the skills to do research, collate that data into useful information, communicate it to others, and then use it to solve problems. From a computer science perspective, you might acquire the ability to code in any number of languages, and to leverage a host of tools to do your job, but HOW you learn that, and how successful you are at acquiring that knowledge, is a direct product of learning to learn.
If it's not broken, fix it until it is. Everything makes sense in someone's mind. Ya can't fix stupid.
Kevin Marois wrote:
College teaches you to learn.
Fascinating how things are different around different parts of the world... We learned how to learn in elementary/primary school (up to 14); from then on it was real knowledge... Granted, you have no real-life experience after college, but you should have a lot of knowledge to help you do things in a real environment... If the college is any decent, you are not learning things from zero at your first job, but learning how to apply theory to real-life situations - and that is called experience... Another aspect of a good college is that you understand that knowledge is temporary and changing, and you will keep pursuing it even 50 years after you left... There are some who suit the self-building process, but they are few (and they are good because they build themselves bottom-up). Most of those who have no solid base rarely become any good...
"The greatest enemy of knowledge is not ignorance, it is the illusion of knowledge". Stephen Hawking, 1942- 2018
-
Forogar wrote:
Do you think getting a degree these days is worth the time, effort and money or should we consider going back to the tried and trusted apprentice system?
I don't think the knowledge I gained in my degree course back in 1988 was fantastically useful; however, there's the degree and there's going to university, and the two are not always separable. I would say that going to university can be useful - in my case my third year was spent working as a COBOL programmer, so it was my first real job and I got a sense of what I didn't want to do. I think if I had not gone to university I would not have gained the confidence to take on perhaps more demanding job roles. On the specific topic of degrees - I think if one wants to become a developer it's probably better to spend the money educating yourself for three years as well as participating in online communities and writing and publishing software. I don't think degrees necessarily prepare people for the work environment.
“That which can be asserted without evidence, can be dismissed without evidence.”
― Christopher Hitchens
-
Forogar wrote:
Do you think getting a degree these days is worth the time, effort and money or should we consider going back to the tried and trusted apprentice system?
Getting a degree is worth it for two reasons. 1. You learn to learn; that alone is a huge boon. 2. Stuff like how a linked list or a B+-tree works internally never gets old. Those theoretical concepts are still very much relevant today. You should either refrain from learning too-particular things (that was my prof's approach) or distill the essence out of particular knowledge to apply it to other things running on the same principles. Example: while we can agree that a modern CPU is orders of magnitude more complex than the venerable 8080, the basic concepts are still the same. I personally am a fan of learning actually (at that time) useful things, as learning theory without any grounding in reality isn't the way my brain works, and then distilling the essence to apply it to new fields. And let's be real, truly new fields are few and far between. Most of the progress computer science has made over the past half century is old wine in new (way fancier and bigger) bottles.
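To illustrate the point about timeless fundamentals, here's a minimal sketch (mine, not from the post above) of the singly linked list idea, which looks essentially the same today as it did in textbooks decades ago:

```python
# A minimal singly linked list: each node holds a value and a pointer
# to the next node; the chain ends at None.

class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def to_list(head):
    """Walk the chain of next-pointers and collect the values."""
    out = []
    while head is not None:
        out.append(head.value)
        head = head.next
    return out

# Build 1 -> 2 -> 3 by prepending nodes, then traverse.
head = Node(3)
head = Node(2, head)
head = Node(1, head)
print(to_list(head))  # prints [1, 2, 3]
```

Whatever language fashions come and go, the pointer-chasing concept underneath is unchanged.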
-
And all you will hear is: "How do you get WiFi on this thing?" "Where's facebook?" "When I swipe left, nothing happens." :sigh:
Sent from my Amstrad PC 1640 Never throw anything away, Griff Bad command or file name. Bad, bad command! Sit! Stay! Staaaay... AntiTwitter: @DalekDave is now a follower!
There would be nothing to swipe since the phones were all dead.
-
Forogar wrote:
Do you think getting a degree these days is worth the time, effort and money or should we consider going back to the tried and trusted apprentice system?
The classical way of learning a craft was to become an apprentice, after a few years passing a practical test to be entitled a Journeyman, and maybe some years later demonstrating that you can manage a complete work task, where you have to demonstrate a large number of skills, to become a Master.

In modern Norwegian education, that is still the way to become a craftsman, but the practical training is now interspersed with classroom lessons where you learn not only what to do, but why to do it. Theory with a very practical orientation. And knowing both the what and the why is a very good combination. Those who have done all of their learning in a classroom may know the "why"s better, but may be clueless about the "what".

Lately, the traditional classroom teaching of computer science has, here in Norway, been supplemented with a program similar to the old crafts learning: you are hired at a software house as an apprentice, working with a skilled programmer, but spending a few hours every week taking classes at a local college to learn the necessary "why"s. After a few years, you may go through a public exam to become a "bachelor" - the old "journeyman" term is not used any more, but that's just another name for the same thing. This kind of education is so new that I am not sure if anyone has gone further to become a Master through practical work (supplemented by practical theory).

I am very much in favor of this educational system. We may need purely theoretical education as well, but as a supplement to that (and a major one!), I think it is very valuable.
-
W∴ Balboos wrote:
Just go to Q&A and see what computer science courses are bringing you
That is a very unfair argument! Q&A hosts mostly the idjits who are too lazy to do the basic research required. The few good questions come from the potentially competent coders!
Never underestimate the power of human stupidity RAH
Well - we've had work done by (youngish) contractors (outsourced). I'll be generous and call it shyte. The good-to-great coders are going to be inspired. Self-taught, even if in the CS courses, because they just can't help themselves. Taking the course - with the idea of it being a lucrative vocation "with a future" - is only producing a bunch of drones who actually could be replaced by software that writes software.
"The difference between genius and stupidity is that genius has its limits." - Albert Einstein
"If you are searching for perfection in others, then you seek disappointment. If you are seek perfection in yourself, then you will find failure." - Balboos HaGadol Mar 2010
-
Forogar wrote:
Do you think getting a degree these days is worth the time
Yes! I had the opportunity to work with youngsters (20-25 years younger than me, and I'm from 1972) who had only a bunch of courses... Even if those courses are of the best quality, such people lack the solid foundation that a good degree gives you. And that lack of foundation makes the bad ones dangerous and the good ones frustrated (first-hand experience with them)...
"The greatest enemy of knowledge is not ignorance, it is the illusion of knowledge". Stephen Hawking, 1942- 2018
I've worked with folks who had 15-20 years of experience, yet lacked a basic understanding of how a database works, and didn't understand why coding for efficient execution matters when you have 100,000 concurrent users.
-
Kirill Illenseer wrote:
Getting a degree is worth it for 2 reasons. 1. You learn to learn, that alone is a huge boon. 2. Stuff like how a linked list or a B+-tree works internally never gets old.
Spot on! Lack of knowledge of how and why things work is a detriment in any field. As others have mentioned, it matters which degree one gets. I work in private industry and government, as a contractor and later FTE -- I have an AAS in CS and a BS in CS/Mathematics. The "learning how to learn" and general background have benefited me throughout my career. The things I did as recently as 10 years ago have no direct relevance to my current job. Everything changes, often too rapidly, so we keep on learning and building upon what we already know. A PhD would have done nothing for me. I sort of regret not getting an MS, but I focused on learning new technologies and getting relevant certifications. My career might have been different with an MS, but I can't say that it would be better -- just different. Personally, certifications have done absolutely nothing in terms of making me better at anything -- BUT -- as others have mentioned, it's a checkbox. Ya got the right certs, ya get picked for interviews. Please note that certifications made me REALLY good at successfully taking tests. :laugh:
-
Forogar wrote:
Do you think getting a degree these days is worth the time, effort and money or should we consider going back to the tried and trusted apprentice system?
Your question is whether a degree is worth it or whether we should use an apprentice system. My answer is YES - degrees can be worth it. But keep in mind that someone with a two-year degree from a cheap community college, followed by 2-3 years at a cheaper four-year school, will earn as much as a person with a degree from an expensive four- (probably five-) year school. Also, a person with a two-year degree who then goes and works in the industry for three years will be making as much if not more than the person with the four-year degree right out of school, and will probably be more productive at that time. In 3-6 months it (productivity) won't matter one whit. I also know of a person who did a code camp, got a job, and is doing fairly well for themselves. But there are foundational things that can only be learned by taking the time to study them, and the people who do are important. So yes - we should, depending on the person. Take my three children: one I would recommend a code camp to; she would be a great programmer. Another I would only recommend a four-year degree; she is awesome and would be a great business analyst in the long run. My youngest: a two-year degree and get to work, boy; he would be an amazing DBA. Alas, they have their own ideas. But that is how they would work best. The best solution fits the person it is aimed at.
To err is human to really mess up you need a computer
-
100% yes at least with STEM careers. Nearly every HR company filters on degree first so without one you won't even get eyes on your resume/CV. You're up against hundreds if not thousands of other applicants and they can't read every application. Even if you get through, you're at an enormous disadvantage. Consider that companies don't necessarily want "the best." They want the safest choice that can get the job done. Degree = safety (to some extent). No degree = risk. As much as I despise this system, that's how it is. I've gotten to about eight final interviews over the past 2-ish years, some for senior positions, and when I don't get a call I always reach out to whoever my HR contact was to ask "What could have strengthened my candidacy in the final steps of the process?" The couple responses I've gotten were a degree. Which is why I'm currently finishing my degree so I can land a decent job. As far as apprenticeship vs degree, I like the middle-ground. I have a friend that is a lineman (works on power lines, transformers, etc). The way they do it is basically take a 4-year degree, strip out all the unrelated classes, then you do both schooling and an apprenticeship for those years. I believe the first year is just schooling but apprentices also get paid a fair wage (it isn't free labor like many CS/SE internships). After the schooling, you'll still be an apprentice until your mentor signs off that you're ready to go at it on your own. Side Note: Nearly all those interviews I've landed the company used a test project to filter candidates which is why I imagine I made it into the process without a degree. Many companies don't bother doing this. Also I can only speak to the US.
Jon McKee wrote:
100% yes at least with STEM careers. Nearly every HR company filters on degree first so without one you won't even get eyes on your resume/CV. You're up against hundreds if not thousands of other applicants and they can't read every application. Even if you get through, you're at an enormous disadvantage. Consider that companies don't necessarily want "the best." They want the safest choice that can get the job done. Degree = safety (to some extent). No degree = risk.
As an acquaintance of mine put it ~15 years ago, the bachelor's degree has become the modern white-collar union card. It's not a necessary or sufficient condition to prove you're qualified to do a job, any more than Grandpa's plumber's union card ever was; but without a BS/BA you'll be locked out of a very large fraction of today's jobs, just as he wouldn't be allowed on most large construction sites without his card.
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, weighing all things in the balance of reason? Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful? --Zachris Topelius Training a telescope on one’s own belly button will only reveal lint. You like that? You go right on staring at it. I prefer looking at galaxies. -- Sarah Hoyt
-
All hail the mighty abacus :)
Soroban makes more sense in the base 10 system.
-
Forogar wrote:
Do you think getting a degree these days is worth the time, effort and money or should we consider going back to the tried and trusted apprentice system?
I happen to be a self-taught person, but I have had the good fortune to work for companies that have let me explore not only technology, but also business operations, under people who really understood what they were doing. As I've gone through my career, I started working for larger and larger companies and observed that the largest companies tend to employ very few people who understand how business and technology work, and especially how they work together. I tend to believe that we have too many people getting into technology and/or business without the proper aptitude. People seem to think that just anyone can get into tech, but it's much more nuanced than that.

I think that CS education still has its place, but not like it used to. People still need to know theory because we still need to move computing technology forward. What is missing is an education track for business developers. CS graduates shouldn't want to make an application; they should want to make the next operating system to run it. They shouldn't want to make a web page or web application; they should want to write the actual web server. There should be a business software track that focuses on line-of-business application development, since that's the majority of what businesses are looking for: people who understand the general technology and put it to use for business purposes.

Whether we like it or not, the majority of software that needs to be written is for accounting or business operations, not the "next big thing." It's our job as developers to make the business more efficient and to give value. It's like the difference between a doctor and a nurse, or an accountant and a bookkeeper. Both are needed, but in different quantities and for different things.
-
I have an MS; I graduated 4-5 years ago. The MS was pretty much useless: almost nothing I learned there transferred to the real world, other than the extra experience I got writing programs for my professor and some web design that went with it. However, a lot of the people who took classes with me did take a lot into their jobs, but they were already working at a lab and being paid to get their MS; they were already doing research for work and simply turned that into their thesis. My undergraduate work, however, I feel was very useful: I learned how to learn to program in any language. They taught us Python, C, C++, and Java, which gave us a pretty good base to start from, but they focused on the design of programs instead of the languages themselves.
-
Kevin Marois wrote:
College teaches you to learn.
Fascinating how things differ around different parts of the world... We learned how to learn in elementary/primary school (up to 14); after that it was real knowledge... Granted, you have no real-life experience after college, but you should have a lot of knowledge to help you do things in a real environment... If the college is any decent, you aren't learning things from zero at your first job, but learning how to apply theory to real-life situations - and that is called experience... Another aspect of a good college is that you come to understand how knowledge is temporary and changing, and you will keep pursuing it even fifty years after you leave... There are some who suit the self-taught path, but they are few (and they are good because they build themselves bottom-up). Most who have no solid base rarely become any good...
"The greatest enemy of knowledge is not ignorance, it is the illusion of knowledge". Stephen Hawking, 1942- 2018
In the USA, unfortunately, the education system really is a fancy baby-sitting service until high school. At high school some basic, relatively useless skills are taught. True comprehension of material is really only taught in colleges. One of the best descriptions I have heard is that a two-year degree provides enough knowledge to use software tools, four years allows you to understand the tools, and six years will give you the skills to create the tools. Is it possible to learn to use a tool in less than two years? Certainly; there are tons of "learn in a week" sessions and such. "Bootcamps" exist to provide intensive training in how to code, and will allow the graduate to do so - if nothing unexpected goes wrong. Experience, with some guidance, is what builds the ability to fix a problem. College is basically an institutionalized apprenticeship.
-
College is an institutionalized apprenticeship. The system was adopted because not every tradesman has the ability to pass skills to an apprentice, and if the apprentice has chosen a bad master, his life is doomed to failure. By institutionalizing the process, colleges gain reputations for how well their faculty pass skills to students, so the apprentice can choose a college with some proficiency in what he wishes to do. Sadly, other factors contribute to a good reputation where it may not be deserved: research by faculty (which in no way reflects teaching ability), size, and, of course, athletics. I do tend to believe colleges also provide an overall perspective for students. A respect for other disciplines, perhaps especially the arts, is a necessary part of college life. This "renaissance" way of thinking creates a more respectful and tolerant individual who knows there is much more to life than coding twelve hours a day, then going home and either coding for pleasure or playing video games to numb one's mind. It is perhaps odd that a man with a Bachelor's in Mathematics, Computer Science and Physics, followed up with a Master's in Computer Science, would have such a high opinion of the arts; however, I have found that history is mostly cyclical, and that science moves us forward while creating other problems that must then be solved by more science. The arts provide us with a different perspective to occasionally break that cycle. Do I believe in college? I proudly sent my sons off to college. Their choices differed from my choice at that age: they have different interests, so I didn't push my school on them as some parents do. Expensive? Oh, oh, yeah. Will it give them advantages when going to get jobs? Maybe, but probably not: they did not attend Ivy League or top-ten universities. Do I believe it was worth it? Absolutely; I can see it every time they solve any situation in their lives. Given all of that, would I do it again? In a heartbeat!