I like this question ... There's a song, "Everything Old is New Again". If you can find an MP3, snaffle it.

'When' I started, the big things were:

* Networked operating systems (the kind that evolved into AmigaDOS, CTOS, SNA 6.2) and died with the internet.
* Network data models (the kind relational systems killed off), which O/R mappers (ORMs) and object graphs keep reinventing.
* Hand-held devices like my HP calculator; small computers.
* Useful languages like Algol-68, Simula, Smalltalk, awk, SNOBOL -- maybe the 4GL ideas that made solutions tractable. BUT employers still want you to code in (um) second-level assembler languages. I 'get' it: language technology only adds 3% to our development productivity (depending on 'how' it's measured).
* I'll debate that, because 4GLs also add value to the business analysis -- and that gives you 60% of project performance, any way you measure it.
The biggest "next big thing" was that management would learn something about software development (because at that time there were no developer-managers, just people wanting an H-RESULT :laugh:). ... I'm older now. I've worked my way through two master's degrees in technology and management, so I might be wiser (grunt :suss:). The intractable software development problem is NOT technical, and not the technology. If I can comment: an architecture professional told me the same story, with the same profile, about builders. In contrast, chemical engineers 'prescribe' deviation below +/- 10% on projects. If your project exceeds that, it dies. I'm sure everyone knows what happens when the costs on phase I go past 10%.

The "next big thing" for me? They all failed to deliver on the promise. If I find smart people, we can fulfil that promise (from 40 years ago). When Apple was new, DEC was new. HP was old, IBM was old, 3M was old then. There was NO 'Microsoft', no 'Google', no 'Yahoo', no PCs.