I'm a Relic
-
I remember fondly the heyday of computer programmers. We were a curious mix of wizards and gods, silently tapping away at keyboards, shunning those new-fangled mouse things as long as possible. We were cowboys, outlaws, and warrior poets weaving titanic tales of bytes and opcodes, roaming the electronic frontier during the burgeoning era of personal computers, free to do as we pleased and answering only to our peers. We could cram amazing amounts of code into just 4K of memory because we knew assembly language and we knew the value of a single byte of memory. We fed off the tit of mother COBOL and her evil cousin, Fortran, and we praised Pascal for its type safety and sheer elegance. We dabbled fearlessly in LISP, mastered the DOS command line, knew the difference between extended and expanded memory, and decided early on that Windows was Hell incarnate. We taught ourselves C and then C++, still thinking tight and efficient code mattered to someone other than ourselves. We struggled to learn MFC's quirks and eventually began to fondly recall the exquisite and deft code used to circumvent the library's limitations, or, as we put it, extend its usefulness. And then came .NET and cookie-cutter applications. Suddenly we were thrust into the maelstrom of "me-too" programming, populated by 12-year-olds who believe that the OS should be web-based and who have neither awareness of nor respect for those who came before: those who could write 100,000-line programs from scratch with nothing more than a few hastily scratched verses on a Post-it note. I'm a relic. I like the old days. I like the old ways. There. I've said it.
"Why don't you tie a kerosene-soaked rag around your ankles so the ants won't climb up and eat your candy ass..." - Dale Earnhardt, 1997
-----
"...the staggering layers of obscenity in your statement make it a work of art on so many levels." - Jason Jystad, 10/26/2001
Yes! Just look for the 5-kilobyte Wolfenstein clone written in JavaScript and you will see that relics still live in remote regions of the wild Earth.
---------- Siderite
-
Christopher Duncan wrote:
Whoever decided that HTML was a valid basis for application programming should be taken outside and summarily executed, in an exceedingly slow and clumsy manner so as to be a fitting punishment
Can we really do that? Count me in on the search for the first one who took that unspeakable action! I do think we need to work more on that punishment, though.
Umm, how about a clumsy environment with a separate dialog box for every field, devoid of any shortcuts, used for data entry? And abasing him by sentencing him to "data entry till death". A sketch of such an environment:

1. Enter the first name: ________ Then a question: "Are you sure this is the first name?" If yes, then:
2. Enter the middle name: _______ Then the same question again: "Are you sure this is the middle name?" If yes, then:
3. Enter the last name: ________ And yes, a question again. :-D

This is merely the name-entry session. Next would be the father's name, and similarly the address, telephone number, SSN, and so on. The guilty would serve the world and pay for his deceit simultaneously. :) How painful is that?
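For the record, the proposed torture device is only a few lines of Python. A toy sketch (the field list and the `answers` iterator are made up for illustration; a real sentence would use interactive `input()` calls):

```python
# A toy implementation of the proposed punishment: every field gets
# its own prompt plus a confirmation question before moving on.
# Field names are illustrative, matching the post above.

FIELDS = ["first name", "middle name", "last name",
          "father's name", "address", "telephone number", "SSN"]

def sentence_to_data_entry(answers):
    """Walk the victim through one confirm-everything entry session.

    `answers` is an iterator yielding (value, confirmed) pairs,
    standing in for interactive input() calls.
    """
    record = {}
    for field in FIELDS:
        while True:
            value, confirmed = next(answers)
            # "Are you sure this is the first name?" ... and so on, forever.
            if confirmed:
                record[field] = value
                break
    return record

# One fully confirmed session:
session = iter([(v, True) for v in
                ["Ada", "King", "Lovelace", "Byron",
                 "St. James's Square", "555-0100", "000-00-0000"]])
print(sentence_to_data_entry(session)["last name"])  # Lovelace
```

Any "no" answer just loops the victim back to the same prompt, which is the whole point.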
-
WordStar, heh, heh, I remember the CP/M and MP/M days....
There's nothing like the sound of incoming rifle and mortar rounds to cure the blues. No matter how down you are, you take an active and immediate interest in life. Fiat justitia, et ruat cælum
-
Paul Watson wrote:
HTML sort of works. It is here now.
The same was true of DOS. HTML is a markup language for displaying and linking static documents, and for that it does fine. It wasn't designed as a foundation for making the web browser a virtual operating system for application programming. Nonetheless, that's what it is these days. It's evolved, true, from a massively clumsy environment to just "mostly clumsy" (similar to the Hitchhiker's Guide's classification of Earth :)). Client application programming (or system programming) in Windows offers an incredibly rich feature set and the tools for a much more elegant UI. Instead of leveraging this power and bolting on a global TCP/IP network, we're trying to make chicken salad out of, er, chicken droppings. True, the web is cross-platform, and I think that's its biggest benefit. However, what if we took the same approach to global standards and augmented the web browser with another cross-platform environment, one actually designed for application programming and a rich UI and desktop experience? I'm telling you, there's a Killer App in here somewhere...
Author of The Career Programmer and Unite the Tribes www.PracticalStrategyConsulting.com
It isn't just cross-platform, it is instantly deployable. No installation, no downloading files and running them, no one-click installers even. You just point your browser at an address and whatever that address wants to show you, it can. When it updates, it updates. And if you have a browser, most of the stuff out there just works: no extra downloads, no .NET Frameworks or operating system requirements. Sure, some sites need Flash (98% penetration) or Java (penetrated), and you get quirks between browsers, but they're hardly ever show-stopping. No firewall problems either; most firewalls are happy with port 80. All of this can be improved on by other systems, new software, or browser plugins, but those all start at a huge disadvantage: zero penetration. Not just zero penetration on target systems, but zero penetration in developers' minds, in documentation, examples, hacks, workarounds, tutorials, lectures, conferences, tools, vendors, platforms, etc. And whatever this Killer App is, it has to be an open standard, it has to be easy to hack (HTML over HTTP is brilliantly easy to hack; it is just text, and to me binary standards are dead in the water for hacking), and it needs to be uncomplicated. The attempted replacements have all been too complicated, over-engineered. They solve everything, including the edge cases, but they make things difficult for the 99% of problems that aren't edge cases. We don't always need to be able to build a kitchen sink :)
regards, Paul Watson Ireland FeedHenry needs you
eh, stop bugging me about it, give it a couple of days, see what happens.
-
I remember fondly the "good old days" like you. I was a 6502 microprocessor assembly language programmer and had to "hand assemble" many of the programs I wrote. Even today I still remember most of the opcodes: A9 = LDA immediate, 85 = STA zero page, EA = NOP, etc. I remember those days fondly, and just the other day I found a copy of VisiCalc that would run on today's PCs and toyed with it. Would I want to give up my Visual Studio .NET RAD and go back to hand assembly? Give up my Excel 2003 for VisiCalc? To return to the "good old days"? Hmmm ... for the fun and atmosphere back in those glory days, heck yes. It was a great time with many world-changing events going on. CW
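Hand assembly of the kind CW describes is essentially a table lookup: mnemonic plus addressing mode gives the opcode byte, followed by any operand bytes. A toy sketch in Python (the table covers only a handful of 6502 instructions, and `assemble` is a made-up helper, not any real tool):

```python
# Hand-assembling 6502 code: look up (mnemonic, addressing mode) to get
# the opcode byte, then append the operand bytes. A tiny opcode table:

OPCODES = {
    ("LDA", "imm"):  0xA9,  # LDA #$nn - load accumulator, immediate
    ("LDA", "zp"):   0xA5,  # LDA $nn  - load accumulator, zero page
    ("STA", "zp"):   0x85,  # STA $nn  - store accumulator, zero page
    ("ADC", "imm"):  0x69,  # ADC #$nn - add with carry, immediate
    ("CLC", "impl"): 0x18,  # clear carry
    ("NOP", "impl"): 0xEA,  # no operation
    ("RTS", "impl"): 0x60,  # return from subroutine
}

def assemble(lines):
    """Assemble (mnemonic, mode, *operand_bytes) tuples into bytes."""
    out = bytearray()
    for mnemonic, mode, *operand in lines:
        out.append(OPCODES[(mnemonic, mode)])
        out.extend(operand)  # zero or more operand bytes
    return bytes(out)

# LDA #$02 / CLC / ADC #$03 / STA $10 / RTS  ->  stores 2 + 3 at $10
program = assemble([
    ("LDA", "imm", 0x02),
    ("CLC", "impl"),
    ("ADC", "imm", 0x03),
    ("STA", "zp", 0x10),
    ("RTS", "impl"),
])
print(program.hex())  # a902186903851060
```

The relics did exactly this with a printed opcode chart and a pencil, which is why the hex values stuck in memory for decades.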
-
Granted, all good and worthy attributes. But good attributes for a DOS app still leave you with a DOS app (or its latter-day equivalent). Given that I've always admired your idealism, I wouldn't have figured you for the going-along-with-the-herd type when a standard is clearly subpar. Say, you didn't sneak off and join the real world when we weren't looking, did you? :-D
Author of The Career Programmer and Unite the Tribes www.PracticalStrategyConsulting.com
-
Hehe. I love HTML, CSS, and JavaScript. It works amazingly well: the whole concept of linked documents with interactive elements, all served by servers that can do amazing processing. Brilliant. All the replacements I have seen are simply "let's do Windows in a browser", and that sucks. They all lose URLs, they lose text's hackability, they lose openness and a boatload of other things that HTML over HTTP brings. This is why I am more keen on supporting WHATWG, XHTML Forms 2.0, and open standards that build on HTML over HTTP rather than replacing it.
-
Well, ColdFusion is in some ways like interpreted BASIC (which I used even before Fortran), so in a sense we've come full circle. I do hate those 12-year-olds who can hand your posterior to you, however. ;)
codewizard
-
Relic? Hell, you're a youngster compared to me. I just celebrated my 45th anniversary as a programmer. I started in the days of coding pads (paper), keypunch cards, and three-day turnaround on assemblies (compiles, to you youngsters) on machines with 16K characters (IBM 1401) or 160K characters (IBM 7080). COBOL was in the birthing stage, and where I worked Fortran was only used for scientific applications, so we still used machine language. John Backus and Peter Naur (of BNF fame) were still wrestling with the notion of meta-languages, and the great hope for the future was ALGOL, then PL/1.

Computers were massive beasts weighing several tons and required huge air conditioning systems to dissipate the heat. The primary storage medium was magnetic tape on 2400' reels; we had 24 drives on the IBM 7080. The first disk drive I used was an IBM 1405 RAMAC (second-generation disk) with 10 million characters. It was 5' x 4' x 4' and had two read/write heads (that's right, 2) that moved up/down and in/out. Max seek time was over 2 seconds. Of course, the 1401 cycle times were measured in milliseconds! Everything was run in batch mode overnight, so we coded by day and were on call every night to solve problems that might occur.

I'm still an active programmer and I much prefer the development environment I have now. I have no desire to return to "the good old days".
-
The difference between your programming peak and mine is that a few scribbles on a Post-it note would be just a few lines of code from scratch, not your unmanageable 100,000 lines. Embrace the future and the product of your hard work. You should be happy to see younger programmers emerging; their ideas will mature with them.
-
I remember, the first time I saw a mouse (the computer kind), being disappointed that it didn't look more like a mouse (the rodent kind). I was attending a user group in Stamford in the late '70s and there was an Apple Lisa on display. I soon realized that I was there as window dressing (the old-fashioned kind) among the mostly middle-aged men. And, yes, I learned how to keypunch the most code possible onto a Hollerith card. No mention of trying to decipher an IBM manual; the MSDN Library is clarity itself in comparison. :)
Marianne G. C. Seggerman This above all, to thine own self be true and it follows as the night the day thou canst not then be false to any man.
-
Me, too. I typed my first computer program on a keypunch machine. The last time I used my knowledge of keypunch was when I figured out that the holes in my Federal Income Tax refund check were my husband's and my Social Security numbers. I do prefer multi-colored Windows monitors to the old green letters on gray terminals. Wouldn't go back to them if you paid me!
-
"...the staggering layers of obscenity in your statement make it a work of art on so many levels." - Jason Jystad, 10/26/2001I am nearing the traditional retirement age. I remember learning the IBM 1620 totally on my own. My college’s 1620 had 40,000 digits of core. The 1620 was sometimes called CADET (Can’t Add Doesn’t Even Try) because memory locations (decimal) 001xy needed to contain the result of x+y (the digit “flag” held carry.) Multiply tables used locations 00200 to 00399. A NOP took 160 uSec. All programming done with tab cards – often called IBM cards. Several jobs later I did OS CDC 6000 series mainframe OS code. The max core in the mainframe was just under 1 megabyte and it would have been an entire MB if words had been 64 bits wide rather than 60 bits wide. The I/O processors were 4Kx12-bit. Much of the OS ran in these. Still using cards and batch mode. Interactive development was just starting to be almost useful. 640K limit? Gimmie a break. 6000 series I/O processors had 6K. (But characters were 6 bits wide. Perhaps that made it an 8K machine.) Later I did a bunch with the 8085/Z80 microcontrollers. Max memory 64K for RAM and ROM. And that was 64K total not each. I used CP/M and (are you ready for this?) CP/M-86. I even got my company to buy WordStar for CP/M-86. I did the 8085 firmware for the Fulcrum Computer Products OmniDisk. It deblocked physical sectors into the 128-byte sectors CP/M used. CDC decided to kill SCOPE (the OS with 75% of the user community) in favor of the mainframe OS developed at the corporate HQ in Minneapolis. I moved on to a company developing a terabit storage application using reels of 2-inch wide videotape. The tape in those reels was a mile long. Multiple tape transports were needed to get to a terabit. We copied tape blocks to 3330 disk packs with only the core in a PDP-11 system. We had to do it on the fly because it would cost too much to have a half a dozen 64K blocks of buffer storage. 
Today, a terabit (125GB) of disk is almost too small for many personal computers. I have had some fun with 30-year-old engineers, telling them how core memory worked. They know all the required physics, but haven't a clue how ferrite beads could be configured into a memory module. 'Nuff reminiscing. I'm using WinWord in a 32-bit version of XP running on a 64-bit CPU with a gig of RAM, with classical guitar mp3 files playing in the background. I'm in Silicon Valley. Anyone needing a software generalist, please visit hmtown.com
-
"...the staggering layers of obscenity in your statement make it a work of art on so many levels." - Jason Jystad, 10/26/2001John Simmons / outlaw programmer wrote:
Suddenly we were thrust into the maelstrom of "me-too" programming, populated by 12-year olds who believe that the OS should be web-based, and that have no awareness nor respect for those who came before
I wouldn't say that about all the young programmers. When I first started programming, all I had was an old Pentium 100 with Windows NT 4.0. I did not have any development software or compilers at the time, so I was stuck with QBASIC 1.1. I still loved it, though. Then I finally got a C compiler: one of Borland's old compilers that ran on DOS. All the BASIC, C, and C++ programming I have done was on DOS; that was all I had. I tried to write the smallest, most efficient programs I could possibly write. I even knew how to program the VGA card and interact directly with the keyboard in DOS. I started to learn x86 assembler in Windows and tried really hard to make Windows programs using assembler. MFC was another thing that I partially learned; I hate MFC, it just seems poorly designed to me.

When I discovered C# and .NET, I was able to do so much with it. It gave me power almost comparable to C++, and I had a well-designed API (.NET) to work with. C# lets me do almost all the things C++ lets me do: it gives me pointers, Win32/64 APIs, and generic programming, plus things that C++ does not have, such as properties, parameterized properties (a.k.a. indexers), interfaces, and a clean object model that just makes sense. And over time we will continue to see more features in C# and .NET, such as LINQ, along with speed improvements in the JIT, the garbage collector, and a better optimizing compiler.
-
Steve Echols wrote:
microsoft's M.exe editor
I still remember Edlin. Hey!! My fingers also remember how to type it fast enough! :laugh:
"It's supposed to be hard, otherwise anybody could do it!" - selfquote
"No one remembers a coward!" - Jan Elfström 1998
"...but everyone remembers an idiot!" - my lawyer 2005 when heard of Jan's saying aboveI still sometime finding myself typing "vi" editor commands even in .NET's editor window. It was hell to learn and near the ulitmate in counterintuitive. But, it had two things going for it. 1) Once you did learn it, you could code/edit extremely fast. 2) It was the best universal editor for Unix/Linux. (Now to all those Emacs fans out there, I have to admit I never learned it because it was more complex, but more so because not every Unix variant or installation had it installed while I could always count on vi being there. And, when using X-windows or a good terminal emulator, you could cut and paste between files just like on Windows. --------- On a similar, slightly related note. I've always been puzzled that computers are about 1000 times faster than they were 25 years ago, yet Word 2003 doesn't work any faster than WordPerfect 5.l for DOS did.
Andrew C. Eisenberg Nashville, TN, USA (a.k.a. Music City USA) (Yes Virginia, there are rock and roll stations in Nashville! :laugh:)
-
WordStar, heh, heh, I remember the CPM/MPM days....
............................. There's nothing like the sound of incoming rifle and mortar rounds to cure the blues. No matter how down you are, you take an active and immediate interest in life. Fiat justitia, et ruat cælum
Robert M Greene wrote:
WordStar, heh, heh, I remember the CPM/MPM days....
Heck, then you should remember the Baby Blue processor! I was in heaven when we finally had a version of WordStar on the PC. I remember sending them a driver for the HP LaserJet when it first came out, in exchange for free copies of the software. I still carry it around in my personal BIN, though I guess I haven't used it in a couple of years. I think nowadays there is just too bloody much chaos to focus, or things have too many elements to even know where to begin. People understood the zen of having peace and quiet to focus on a solution.
-
"...the staggering layers of obscenity in your statement make it a work of art on so many levels." - Jason Jystad, 10/26/2001How True. It was pure heaven when I was successful in compiling a C program on a Z80/CP/M machine with only two 5 1/4" floppy drives and 64KB RAM.:). It was so much fun trying to save a few bytes in the assembly language program. Somehow, I feel that kind of challenge is missing now. I now feel like laughing at myself when I recollect the arguments that I used to make saying that Windows is nothing but a SHELL around DOS. I evny the current set of software artists who are able to make a PC do so much without all the grinding that we had to go through. ganesh kalmane
-
"...the staggering layers of obscenity in your statement make it a work of art on so many levels." - Jason Jystad, 10/26/2001Well said, John! Raise a cup to $59 development systems and OSs that cost less than the hardware they slow down.:beer:
"...a photo album is like Life, but flat and stuck to pages." - Shog9