Both Jobs and Ritchie made the world a better place. We don't need to tear one down to give the other his due. I think Ritchie would have appreciated this spoof that appeared in an April 1 issue of Computerworld many years ago:

CREATORS ADMIT UNIX, C HOAX

In an announcement that has stunned the computer industry, Ken Thompson, Dennis Ritchie and Brian Kernighan admitted that the Unix operating system and C programming language created by them is an elaborate April Fools prank kept alive for over 20 years. Speaking at the recent UnixWorld Software Development Forum, Thompson revealed the following:

"In 1969, AT&T had just terminated their work with the GE/Honeywell/AT&T Multics project. Brian and I had just started working with an early release of Pascal from Professor Niklaus Wirth's ETH labs in Switzerland and we were impressed with its elegant simplicity and power. Dennis had just finished reading 'Bored of the Rings', a hilarious National Lampoon parody of the great Tolkien 'Lord of the Rings' trilogy. As a lark, we decided to do parodies of the Multics environment and Pascal. Dennis and I were responsible for the operating environment. We looked at Multics and designed the new system to be as complex and cryptic as possible to maximize casual users' frustration levels, calling it Unix as a parody of Multics, as well as other more risque allusions.

Then Dennis and Brian worked on a truly warped version of Pascal, called 'A'. When we found others were actually trying to create real programs with A, we quickly added additional cryptic features and evolved into B, BCPL and finally C. We stopped when we got a clean compile on the following syntax:

for(;P("\n"),R--;P("|"))for(e=C;e--;P("_"+(*u++/8)%2))P("| "+(*u/4)%2);

To think that modern programmers would try to use a language that allowed such a statement was beyond our comprehension! We actually thought of selling this to the Soviets to set their computer science progress back 20 or more years. Imagine our surprise when AT&T and other US corporations actually began trying to use Unix and C! It has taken them 20 years to develop enough expertise to generate even marginally useful applications using this 1960's technological parody, but we are impressed with the tenacity (if not common sense) of the general Unix and C programmer.

In any event, Brian, Dennis and I have been working exclusively in Pascal on the Apple Macintosh for the past few years and feel really guilty about the chaos, confusion and truly bad programming that has resulted from our silly prank so long ago."
Steve Caine
Posts
-
RIP Dennis Ritchie
-
The purpose of error messages
You couldn't run fast enough. The speed of the subspace-time implosion would be (naturally) faster than the speed of light. And there's no point making out your will. It would be destroyed along with the rest of the planet. ;P
-
Performance Genius
John Simmons / outlaw programmer wrote: "Is there some subtle sarcasm going on here? I've never seen "++i" used (in a for loop) - ever."

Somewhat subtle. The increment operator is slightly more efficient in its prefix version (++i) because it just increments 'i' and returns its value. The postfix version has to do more work because it returns the *previous* value of 'i', so it has to store that value somewhere before incrementing 'i'.

If you're just incrementing 'i', it makes more sense to use the more efficient prefix version. It's just better form, and it encourages good habits. (For example, applying it to something more complex than an integer, say an iterator into an STL container class, might produce a more significant performance hit than the minuscule one your test found.)

Yet you're right, most of the instances of '++' or '--' I've seen in code, particularly in 'for' loops, are the postfix version. There's no reason for it, yet somehow that has become a near-universal meme in programming. Perhaps it's because the most common example beginning programmers encounter is the classic C-string copy snippet, where you really *do* want the postfix version:

    while (*dst++ = *src++) ;

So I was making a joke that the real coding outrage in the original poster's message was using 'i++' instead of '++i', as if I had completely missed the real outrage of sleeping the thread in each pass through the 'for' loop. (Getting off my soapbox now.)
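To make the difference concrete, here's a minimal C++ sketch (the 'Counter' class is hypothetical, purely for illustration, not from the original thread) of how the two forms of operator++ are typically written. The postfix form has to copy the old value before incrementing, which is exactly the extra work described above:

#include <iostream>

struct Counter {
    int value = 0;

    // Prefix ++c: increment, then return the updated object.
    Counter& operator++() {
        ++value;
        return *this;
    }

    // Postfix c++: copy the current state, increment, then return the copy.
    Counter operator++(int) {
        Counter old = *this;   // the extra copy the prefix form avoids
        ++value;
        return old;
    }
};

int main() {
    Counter c;
    std::cout << (++c).value << "\n";  // 1 -- prefix yields the new value
    std::cout << (c++).value << "\n";  // 1 -- postfix yields the old value
    std::cout << c.value << "\n";      // 2
    return 0;
}

For a plain int the compiler will optimize the difference away, but for a fatter type such as an STL iterator that copy can have a real cost, which is why '++i' is the better habit.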
-
Performance Genius
Because ++i doesn't have to keep track of/return the previous value of i.
-
Performance Genius
Definitely. There's no excuse for
    for ( int i = 0; i < data.Count; i++ )
instead of the obvious, clearly superior and faster
    for ( int i = 0; i < data.Count; ++i )
;P
-
The impermanence of our generation
"Look on my works, ye Mighty, and despair!"
http://en.wikipedia.org/wiki/Ozymandias
Why should we be any different?