Favorite way to categorize programming languages?
-
How do you categorize languages? A recent article on a certain mailing list has debunked "compiled" vs "interpreted". I have long stood by my 3-mutually-exclusive-category system: Type 1) "Hey, look what I can do in only 7 lines!" (Python, C#, most new languages etc.) Type 2) "Hey, look what I can do in only 7 characters!" (Perl, awk, golf-oriented gibberish) Type 3) The good ones.
A lack of planning on your part does not constitute an emergency on mine.
The good, the bad, and the ugly. Ugly) The languages my colleagues prefer that I don't like. Bad) Languages that neither my colleagues nor I use. Good) The languages I use.
Nothing succeeds like a budgie without teeth.
-
How do you categorize languages? A recent article on a certain mailing list has debunked "compiled" vs "interpreted". I have long stood by my 3-mutually-exclusive-category system: Type 1) "Hey, look what I can do in only 7 lines!" (Python, C#, most new languages etc.) Type 2) "Hey, look what I can do in only 7 characters!" (Perl, awk, golf-oriented gibberish) Type 3) The good ones.
A lack of planning on your part does not constitute an emergency on mine.
Honestly, mostly I just think in terms of whether or not a given language is suitable for my current task.
-
Well, APL is quite readable to mathematicians (or at least to a certain share of them). APL was not developed as a programming language at all, but as a notation for teaching the math of matrices. Kenneth Iverson used it as a "blackboard language" for a number of years when lecturing at Harvard in the late 1950s. It wasn't until he quit teaching and moved to IBM that some of his workmates said something like "Hey, we could make a computer do these operations!"

But I must admit that you are right: APL is definitely not a language that you will learn in two weeks. And you must keep on using it; otherwise you will forget so much that you will have to dig up your old APL textbook to make any sense of the APL programs you wrote yourself a few years ago.

My first programming language was Basic - a very basic one, with at most 286 variables named A-Z or A0-A9 to Z0-Z9, and 26 string variables A$-Z$. My next language was APL, and I was extremely impressed by its power compared to Basic. (I was 17 at the time.) I still have my APL textbook, and I'd love to pick it up again - just for doing programming that is very different from at least 95% of all programming done today. (I just need a tuit.)
As an applied math major at an engineering-focused school back in the 80s, I actually had APL as a required 1-credit course. I kind of enjoyed it *because* it was a little arcane.
-
How do you categorize languages? A recent article on a certain mailing list has debunked "compiled" vs "interpreted". I have long stood by my 3-mutually-exclusive-category system: Type 1) "Hey, look what I can do in only 7 lines!" (Python, C#, most new languages etc.) Type 2) "Hey, look what I can do in only 7 characters!" (Perl, awk, golf-oriented gibberish) Type 3) The good ones.
A lack of planning on your part does not constitute an emergency on mine.
1)$$$ 2)$$ 3)$ 4)Nada
-
In my opinion: no. Typing should be explicitly visible in the program text, and clearly identified as a type. Polymorphism, through subclasses, is OK. You can force run-time type errors through casting, but casting is explicit. As pointed out, no language is absolutely bound to being interpreted or compiled, but strict typing leans quite strongly towards a complete parse of the source code before execution starts. And when you do that, why not go all the way and generate the code? So strong typing leans towards compilation rather than interpretation, although not by definition.
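A minimal C# sketch of that point (my own illustration; Animal/Dog/Cat are made-up types): the upcast through the subclass relationship is implicit and safe, while a run-time type error can only be provoked through an explicit, visible cast:

```csharp
using System;

class Animal { }
class Dog : Animal { }
class Cat : Animal { }

class ExplicitCasts
{
    static void Main()
    {
        Animal a = new Cat();   // polymorphism through subclasses: implicit, always safe

        // Dog d1 = a;          // compile-time error: no implicit downcast
        Dog d2 = (Dog)a;        // explicit cast compiles, but throws
                                // InvalidCastException at run time, since a is a Cat
        Console.WriteLine(d2);
    }
}
```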
-
Quote:
Typing should be explicitly visible in the program text, and clearly identified as a type.
I agree that it should be, but I do not believe this is a requirement for a language to be strongly typed. Specifically, C++ `auto` and C# `var` break this rule, yet both languages remain strongly typed, precisely because misusing such an object is still likely to result in a compiler error.
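For instance, a minimal C# sketch (my example, not the poster's): the compiler infers one fixed type for the `var`, and misuse still fails at compile time:

```csharp
using System.Collections.Generic;

class VarIsStillStrong
{
    static void Main()
    {
        // The compiler infers List<int> here; 'var' is only shorthand.
        var numbers = new List<int> { 1, 2, 3 };

        numbers.Add(4);          // fine: Add(int) exists on List<int>
        // numbers.Substring(1); // compile-time error: List<int> has no Substring
        // string s = numbers;   // compile-time error: no implicit conversion
    }
}
```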
-
var is still strongly typed; it is simply syntactic sugar so we can write and read the code more fluently. For ALL intents and purposes it represents a strongly typed reference that must be resolved at compile time, before execution.
-
I'd say C/C++ are weakly typed because you can always convert one type into some random other type. Java/C# only allow limited conversions between related types.
IMO the difference in that respect between C++ and C# is part of the broader difference that C#'s runtime second-guesses your every instruction while C++ takes your word for it. In both cases, the cast makes it past the syntax check with the same meaning: "trust me, these are the same." The exception is that C#/Java will fail validation if the known strong type cannot possibly also be an instance of the casted type; but that is not applicable to C++, because with multiple inheritance an object could always exist that inherits from both of them.

C#'s second-guessing compares the actual type, via reflection, to the casted type, whereas most if not all C/C++ programs at runtime keep no definition of type and no support for reflection. That's a runtime distinction, not part of the language, for the same reason that compiled vs. interpreted is not part of the language. As far as whether the language itself is strongly typed goes, they are the same: a C++ compiler and runtime could be invented that does the same as C# without changes to the language itself, or vice versa. I believe such a C++ compiler could even be made compliant, with enough effort.

The same applies to accessing an array with an invalid index. C# throws an error right away, because it first checks that the index you gave it matches what it knows to be valid, whereas C++ will trust you and perform the operation, likely resulting in a subsequent error (because the CPU second-guesses you at a more security-oriented level, i.e., DEP).
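To make the array point concrete, a small C# sketch (assuming default runtime checks and no `unsafe` code; the C++ behavior is described in the comments rather than shown):

```csharp
using System;

class BoundsCheck
{
    static void Main()
    {
        int[] data = { 10, 20, 30 };

        try
        {
            // C# validates the index before the read and throws immediately.
            // A raw C++ array access (data[5]) would simply be trusted,
            // reading whatever happens to sit past the end of the array.
            Console.WriteLine(data[5]);
        }
        catch (IndexOutOfRangeException e)
        {
            Console.WriteLine($"Second-guessed by the runtime: {e.Message}");
        }
    }
}
```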
-
I would divide general-purpose languages into categories (not all fitting languages are listed):
1. Assemblers
2. C, C++
3. Java, C#
4. Python, JavaScript
5. Perl, TCL
Anything that is not like the 5 categories above is not worth categorizing.
To me, that is really just two categories: you've got (several dialects of) algorithmic language, and then there are those "not worth categorizing".

A programming language is usually a syntax, but much more a way of thinking about a problem solution. There is little difference in the way you break down the problem and model a solution between C and Pascal, and for that sake assembler. Lisp is a completely different way of attacking it. APL yet another way. Prolog resembles none of them (some people may see resemblances between Prolog and Lisp, but only at a very abstract level).

I haven't been asked which languages I know for many years. In those days when languages were really developing, and new concepts arrived quite often (maybe after having been discussed in academia for years before arriving in the programming marketplace), I used to answer that I know:

* Algorithmic, with dialects like Fortran, Pascal, Basic, C whatever, Java, Python, CHILL, assemblers, ...
* Array & workspace: APL (Smalltalk is workspace, too, but I never used that)
* Predicate: Prolog, SNOBOL, XSLT, regex. Maybe SQL fits into this group.
* List & functional: Lisp
* Job scripting: .sh and all its variants, .bat files, lots of others
* Data definition languages: ASN.1, XAML, the DDL part of SQL, XML schema languages (several), JSON, ...

I am somewhat tempted to add:

* Event driven: Win core API, OSI communication protocols

even though that is not a language in the syntax sense, but certainly a quite different way of programming, and of thinking about program design, compared to the monolithic, single-thread, from-start-to-end style of C and Pascal. The Win core model also has significant elements of workspace philosophy. I believe Erlang also comes in the event-driven class, but I never went much beyond the "Hello World" level when I had a short glimpse of Erlang many years ago.

There is a question of where to draw the line for what counts as a "language". Is regex a language? XML/HTML/TeX/SGML/...? General macro programming languages? 'Macro' programming internal to one specific application? The programming language of the HP 41C calculator?

I think it is a pity that we today try to force all sorts of programming into the C style of algorithmic thinking. That thinking pattern is all that most young programmers know; they never consider a predicate approach, a list approach, a workspace solution model. You see only small traces of other elements, e.g. a regex of half a line.
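To illustrate that contrast with a C# sketch (my example; the version-string task is made up): the same test written in from-start-to-end algorithmic style, and as a half-line predicate-style regex:

```csharp
using System.Text.RegularExpressions;

class TwoWaysOfThinking
{
    // Algorithmic style: spell out every step of the scan yourself.
    static bool LooksLikeVersionLoop(string s)
    {
        int i = 0;
        while (i < s.Length && char.IsDigit(s[i])) i++;   // leading digits
        if (i == 0 || i >= s.Length || s[i] != '.') return false;
        int j = i + 1;
        while (j < s.Length && char.IsDigit(s[j])) j++;   // trailing digits
        return j > i + 1 && j == s.Length;
    }

    // Predicate style: describe what a match is, not how to find it.
    static bool LooksLikeVersionRegex(string s) => Regex.IsMatch(s, @"^\d+\.\d+$");
}
```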
-
"Easy to install and execute simple code" programming languages vs. "do an obstacle course, restart the computer, nope, wrong version for this specific platform, you need to recompile with the nag argument, missing manifest file" programming languages.
I'd say that, too, is not really the language's fault so much as the fault of the compiler, the standard libs, the community that uses it, and even the specific project. There's nothing stopping a compiler from being invented that works in place with no environment configuration, yet you get things like C#, where you have to install an IDE to get the compiler; or else you get a project with so many dependencies that the odds of everything just working are low. That's about the fifth time I've said that on this thread. Conclusion: we don't need better/more languages, we need better compilers.
We don't need better languages or more languages, we need better compilers and runtimes.
-
How do you categorize languages? A recent article on a certain mailing list has debunked "compiled" vs "interpreted". I have long stood by my 3-mutually-exclusive-category system: Type 1) "Hey, look what I can do in only 7 lines!" (Python, C#, most new languages etc.) Type 2) "Hey, look what I can do in only 7 characters!" (Perl, awk, golf-oriented gibberish) Type 3) The good ones.
A lack of planning on your part does not constitute an emergency on mine.
C# !C# :)
-
C# !C# :)
Nah, I think I will retire before then. Pretty much been doing C# for the last ten years except for a brief stint of C++ for a year. I'm 64 and have a couple of years of work in front of me that I have to get done, and then I think I will fade away to the beach somewhere. :laugh:
-
var is still strongly typed; it is simply syntactic sugar so we can write and read the code more fluently. For ALL intents and purposes it represents a strongly typed reference that must be resolved at compile time, before execution.
With var, we can write the code more easily, but reading it is in some cases open to interpretation. It should not be necessary to look at a function call and check what it returns just to glean the type that goes into a variable declared as 'var'.
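A short C# sketch of that complaint (GetReport and its return type are hypothetical):

```csharp
using System.Collections.Generic;

class VarReadability
{
    static Dictionary<string, List<decimal>> GetReport()
        => new Dictionary<string, List<decimal>>();

    static void Main()
    {
        // Easy to write, but a reader must chase down GetReport()
        // to learn what 'report' actually is.
        var report = GetReport();

        // The explicit form answers the question at the declaration site.
        Dictionary<string, List<decimal>> sales = GetReport();
    }
}
```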