What? I thought eating a cookie was an instinctive class!

Intro first: I've been "around" programming forever, but recently completed a degree, so perhaps that provides a useful viewpoint. My observation: schools are trying to touch on every subject, leaving time to go in depth on none of them. I suspect this comes from recruitment processes listing 20+ required skills, in an even more impossible number of combinations.

I've successfully coded many moderate-level projects over the years, and "learned" my way through every one. That said, the degree I recently completed taught me very little new material (with respect to development) that I couldn't have Googled in two seconds, except maybe some style conventions. On how many different platforms do I need to know how to say "Hello World"? I do know some higher-level languages, but I learned them partly in classes many years ago and from independent research since then. And no, I don't recall cookies being included in any of my classes, but they certainly should have been!

My solution: schools need to pick one or two pathways (language, platform, whatever) and focus on them into advanced levels. It's not the platform or language that is important, it's the concepts (granted, those are much harder for HR to assess). Throw in some translation/conversion skills, and you'll not only have advanced skill sets but also candidates prepared for the next 1,000 platforms that come out during a student's career. That can only happen if the industry leaders change their recruitment strategies; the colleges are simply trying to follow job demands.

I realize this all varies from college to college, and perhaps mine isn't big on development, but I suspect there is commonality regardless.
User 11232861