I think my favorite quotation about C is, "C is memory with syntactic sugar." AFAIK it is original to Dennis Kubes and first appears here: http://denniskubes.com/2013/04/23/how-to-think-about-variables-in-c/ The nice thing about C is that you can almost always do whatever it is you want to do in C. The downside is that you can almost always do it an order of magnitude more easily in something else, even if that "something else" is C++. Nevertheless, that "something else" is almost universally just an interface, in some form or fashion, to C or C++ code.
Reese Currie
Posts
-
C is a better language than any language you care to name.
-
Does anyone code with LISP?
I program in Emacs Lisp (ELisp) all the time, and love Lisp in that application--so much that I would like to use Lisp for other things. However, I've only used Common Lisp a little, for very small things, and I almost always opt for something else when I have something that needs to get done.

I think the difference is the "atoms". The atoms in Emacs Lisp are geared for text processing, which makes Emacs a very effective text-processing framework, for me anyway. (Apparently only 1% of Emacs users actually write ELisp.) Common Lisp just doesn't appeal to me as a general-purpose language, because in my limited experience with it the atoms don't do it for me. Maybe if I gave it a harder push, but in the presence of other options, I tend toward the other options.

I would not discourage you from learning Lisp, because it changes how you think about programming and, yes, will make you a better programmer in any language for the experience. If you google Paul Graham, he lays out some advantages of Lisp; he used it to build Viaweb, which became Yahoo! Store. A more interesting story for me can be found if you google "Lisping at JPL".
-
Ruby or PHP??
Good question! I guess it depends mostly on what I didn't know at the time. My first self-chosen language to learn was C, because it wasn't part of the curriculum in my school and was inarguably a very important language to know at the time, and yes, I found a lot of the low-level stuff you could do with it quite interesting. Of course I also learned C++ and liked it very much as well, but frankly I haven't kept up with C++ as it has gotten more convoluted, er, feature rich. The very name of the most recent C++ standard is so complex I can't remember it with certainty off-hand.

Lisp was a standout, because it is such a departure from the languages I normally use that I found it very interesting to work with. It was sort of like exercise, making my brain "bend" in a way it didn't normally "bend." Most of my Lisp code hit the recycle bin, but I now use Emacs in part because of the ability to add a little text manipulation in Lisp when I need to. I have a pocketful of Lisp utilities I've written for Emacs that come in amazingly handy when I program for batch processing (which has been my bread and butter on project after project for the past 16 years).

The mixture of OOP and functional features like lambdas and so on made me enthusiastic to learn Python, and Python has served me extremely well and very frequently--it is one of the best languages in existence, in my opinion. The frustration with it is the incredibly slow adoption of Python 3, and the incompatibility of Python 2 and 3 code. That may be the death blow to an otherwise excellent language.

Not every experiment ends in success. For some reason I can't work in Prolog--my brain apparently doesn't bend that way; either that or I didn't pursue it long enough to have the Eureka moment. I guess I couldn't get my mind around describing my problem and letting the computer solve it.

The most recent one I learned by my own choice was Scala. It's quite good, but the documentation all tends to be trying to sell you on the idea of Scala rather than answering your questions--not totally surprising since the source is academia. It's a more pleasant-to-use alternative for almost anything you'd do in Java. Given that I found Lisp so interesting, I probably should learn Clojure, and I have a bookmark for that, but more bookmarks than enthusiasm, I guess--and how many JVM languages or Lisp dialects does one really need? If I worked routinely in .NET, I would look at F# in detail. Ironically, given the original question, I chose to look at PHP over Ruby.
-
Ruby or PHP??
Allow me to interject a little personal philosophy. I've worked in fifteen languages, not counting "shells" as languages, or non-Turing-complete things that are called languages (e.g. XML, HTML). You can only be "really good" in maybe five to seven of them, and a good number of those fifteen I hope to never lay eyes on again.

Their entry into my skillset came from one of four sources: personal choice, academic dictate, project dictate and vendor dictate. Academic dictates are the languages you have to learn in college. Project dictates are the languages you are forced to learn by the projects you work on. Vendor dictates are the languages you have to learn to work with a vendor's product; for instance, if you work with Oracle databases you'll need to know SQL and probably PL/SQL. The languages that are dictated to you are almost always unsatisfactory intellectually, because teams are intellectually diverse, and tools on projects need to be chosen for the least common denominator--or to put it more bluntly, the lowest IQ, the least motivated to learn new things, the least capable of grasping abstract concepts.

Therefore my recommendation for personal choices is to choose languages that you are unlikely to have dictated to you by any other means. Go for the languages that most pique your interest. It is preferable to learn languages that help you expand your problem-solving skills. Working in imperative languages? Try a functional language so you can broaden your approaches to working on problems. Working in compiled languages? Try scripting languages. The core idea is to broaden your range and breadth of knowledge compared to your colleagues.

The most interesting languages I know have all been personal choices. I've practically never had an interesting language dictated to me by academics, projects or vendors. Of course I've subsequently been on projects using some of those personally interesting languages, so there is naturally some intersection between personal choices and project work.

So the answer to Ruby or PHP is: which one interests YOU the most, period. Forget about what's more marketable or whatever other criteria you might have, or what language Joe Blow thinks you should learn instead. It's not like your brain can't hold more than one new language, especially if you get in the habit of learning new languages every year or two (or three at the worst).
-
My first language and interesting early software projects.
Your story reminds me of mine. I also started with a TRS-80 "Color Computer 2" and Microsoft's BASIC, to which I added 6809 Assembler and then C (which I had to get OS-9 to obtain). The problem was, OS-9's line editor was horrific; I couldn't stand working with it. So the first thing I did was write a line editor to replace it that worked more like the one on NCR ITX. (Anyone who has used ITX's line editor knows two things: a) my goal wasn't overly ambitious, and b) OS-9's line editor must have been really bad for an ITX-style editor to be an improvement.)

On the plus side, I got to learn about pointers and doubly-linked lists and so forth, which was a great start on C. It was also a very neat experience to spend a few hours in OS-9's editor writing just enough to get my own editor started, and then code the rest of it in my own editor. On the minus side, the text editor compiled to a little over 10K, which left something like 5K for editing! So setting up an editor I could stand ended up being the last significant thing I did on OS-9, because there wasn't enough memory left to do more. :laugh: I remember sitting down one day to write an enhancement to swap lines between disk and memory, and deciding that it just wasn't worth doing for a computer that connected to a TV.
-
After so many hacks, why won't Java just go away?
I was thinking the same thing, although if one had said, "After so many hacks, why won't Windows just go away", there probably would have been a lot more sentiment defending Windows. I think Java's probably easier to like the less GUI work you do with it, but that might be my own bias :).
-
Wondering about F#
Gave this a 5--great answer, Bob. I like how you fit F# into your process. I've tried and liked F#, but working more with the JVM and very rarely with .NET, I do more Scala. (.NET folks are welcome to send flowers of sympathy.) Scala is a lot nicer to work with than Java.

I wanted to note that Scala gets you part of the way there: the compiler turns direct tail recursion into a loop, and the "@tailrec" annotation makes the build fail if it can't. That's a bit of compile-time smoke and mirrors rather than real tail calls on the JVM, but at least it means you can write tail-recursive code without worrying about blowing your call stack, and the performance "isn't that bad" on things I've written that rely on tail recursion.

I think F# will be more successful long-term because Microsoft is interested in people using it. Oracle doesn't really encourage you to give Scala a try, and most of the writing on Scala is so academic in nature that it really takes endurance to wade through it all to the point of doing something practical with it.
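To make that concrete, here is a minimal sketch of the mechanism described above (my illustration, not code from the thread; the object and method names are invented):

import scala.annotation.tailrec

object TailRecDemo {
  // The recursive call is in tail position, so the compiler can rewrite it as a
  // loop; @tailrec turns "I couldn't optimize this" into a compile-time error.
  @tailrec
  def sumTo(n: Long, acc: Long = 0L): Long =
    if (n <= 0) acc
    else sumTo(n - 1, acc + n)

  def main(args: Array[String]): Unit =
    println(sumTo(10000000L)) // no StackOverflowError, unlike a naive non-tail version
}

Note this only covers a function calling itself; mutually recursive functions still consume JVM stack frames, which is part of the "smoke and mirrors" point.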
-
The first decent code I wrote in my life
Awesome! I went out with one of these girls' sisters for a while, but it was unrelated to having provided code ... :laugh:
-
The first decent code I wrote in my life
My story is from a long time ago--probably 1989. I'd been hired by a small consulting shop as a COBOL programmer. My favorite language, however, was C, and I worked it into my job as frequently as I could. In those days before Perl and Python, all of my quick knock-off "scripts" were C programs. My day-to-day working computer was a Unisys 6000/30 running CTIX, Unisys' version of UNIX System V, plus the accounting software. I was connected to it with a dumb terminal; others were connected by dumb terminals or PCs over serial cables.

The company charged meticulously for time spent consulting, including phone time. To make tracking that time easier, the boss had installed a phone system that tracked time on incoming calls and, when calls completed, sent the details to a serial printer. The trouble was, the paper would bind up in that printer all the time and calls would end up going uncharged. One day the boss was extra ticked off because the paper had been jammed for days and no one had noticed.

Hm, I thought: it's a serial interface, the printer was only maybe 8 feet closer to the phone box than the Unisys was, and I had three serial ports free. How about a daemon to poll the serial port and put the results in a file, and eliminate the serial printer? I proposed the idea to the boss and he allowed me to work on it on a low-priority basis when client demands allowed. So I figured out how to configure and read serial ports from C, and had my serial-printer substitute available in two or three days.

The boss and the girls in accounting were extremely happy about it. They asked me to write some utilities to look up particular phone numbers in the file and so forth, all of which I did in C. The cool part, though, was doing the serial interfacing back in a day when you couldn't just look up how to do things on the Net, but had to actually read the voluminous manuals that came with your system and piece the concepts together yourself.
-
Try/catch block...
FTSOTR, that does compile in Java (with a little syntactic fix) and returns 42.
-
iScreen = iScreen < 8 ? iScream : ...
The code quality is reminiscent of the legacy code I used to maintain that was written in the '70s and early '80s, before Ed Yourdon et al. started talking about structured programming, which is itself by now a very old paradigm. I count a cyclomatic complexity of 51 (academia says 10 is "maintainable", maybe 18 or so if you stretch it), and it's full of "magic numbers" that have no obvious meaning other than that they appear to be screen numbers. It is kind of depressing to see that kind of code being written in a modern language, 30 years after the first efforts were made to educate people out of these kinds of practices.
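As a purely hypothetical illustration of the magic-number complaint (none of these identifiers come from the code under review; they are invented for the sketch), the screen literals could at least be given names:

// Hypothetical sketch: name the screen numbers instead of scattering bare
// literals like 8 through the branching logic.
object Screens {
  val Login        = 1
  val MainMenu     = 2
  val OrderEntry   = 7
  val OrderSummary = 8

  def nextScreen(current: Int): Int =
    if (current < OrderSummary) OrderEntry else MainMenu
}

Named constants don't lower the cyclomatic complexity, but they at least make the branching legible enough that a real refactoring becomes possible.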
-
Am I Being Bullied by Microsoft
You're not being bullied, you're just suffering from a problem endemic to proprietary software: your fortunes are at the whim of your vendor. If a product isn't making the vendor enough money, it can be terminated suddenly. For this reason I've always found it very risky to build on any development technology that is proprietary, regardless of the vendor. The risk of having to rewrite everything in the event of product termination is far too high. I prefer languages that are either covered by an industry standard or are open source.

With regard to Microsoft tools, the CLI (Common Language Infrastructure), C#, and C++/CLI are standardized by ECMA. C and C++ are standardized, formerly by ANSI, now by ISO. (One caveat: I believe Microsoft has stopped keeping its C support fully up to date with the standard.) Because these are covered by industry standards bodies, they are the only Microsoft development tools I would consider completely safe to use, which more or less echoes the recommendations of others, for different reasons. This isn't really a matter of "better/fewer problems" but of proprietary versus non-proprietary. My advice is: steer clear of proprietary.
-
A more realistic tech-movie...
Now, that's some elegant code, right there!
-
Choosing VCS for Single Developer, Small Projects, Two PC's, Two Locations
Mike, I've used both git and Mercurial to do this successfully. For you as a hobbyist I'd recommend Mercurial over git, simply because it's more straightforward to use and has a shorter learning curve. I actually liked git a little better than Mercurial for this, but the learning curve is hard to justify if version control isn't your main focus.

My model was to keep a master repository on one machine and a cloned repository on the other, and to carry Mercurial patches back and forth as e-mail attachments, on USB, or whatever. That was necessary because the content was work-related and my company does not permit using BitBucket, GitHub or any other cloud repository. If you don't want anyone to possibly see what you're working on, do something similar to my way; otherwise the suggestions already given about using a cloud service are the easiest way to go.

Avoid SVN for this task. I manage SVN repositories at work and it is a great tool as a central repository, but it is not a distributed system and it doesn't take kindly to being shoehorned into that role. There is a product called SVK for SVN-style distributed work, but I haven't really tried it: it installs on Windows just fine, but it wouldn't install on Ubuntu, apparently because it relies on deprecated Perl libraries.
-
Coding Challenge Of The Day
OK, I admit it isn't very small. Scala is number 46 on the Tiobe Index this month, so perhaps it is obscure enough to qualify. Since it's a "functional" language, there are actually three functions, but I nested the two helpers inside the outer one.
def convertRomanToArabic(romanNumeral: String): Int = {
  // Value handled in the previous step; reduceRight walks right to left, so this
  // is the numeral to the right of the one currently being added.
  var previous = 0

  def convertSingleRomanNumeral(numeral: Char): Int = {
    numeral match {
      case 'I' => 1
      case 'V' => 5
      case 'X' => 10
      case 'L' => 50
      case 'C' => 100
      case 'D' => 500
      case 'M' => 1000
    }
  }

  // A numeral smaller than its right-hand neighbour is subtracted (IV, IX, XC, ...).
  def addRomans(next: Int, accumulator: Int): Int = {
    if (previous == 0) previous = accumulator
    val addto = if (previous > next) next * -1 else next
    previous = next
    accumulator + addto
  }

  val values = romanNumeral.toList.map(n => convertSingleRomanNumeral(n))
  values.reduceRight(addRomans)
}
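For what it's worth (my addition, not part of the original post), a couple of quick checks from the REPL:

convertRomanToArabic("XIV")     // 14
convertRomanToArabic("MCMXCIV") // 1994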