Favorite way to categorize programming languages?
-
In my opinion: no. Typing should be explicitly visible in the program text, and clearly identified as a type. Polymorphism through subclasses is OK. You can force run-time type errors through casting, but casting is explicit. As pointed out, no language is absolutely bound to being interpreted or compiled, but strict typing leans quite strongly towards a complete parse of the source code before execution starts. And when you do that, why not go all the way and generate the code? So strong typing leans towards compilation rather than interpretation, although not by definition.
Quote:
Typing should be explicitly visible in the program text, and clearly identified as a type.
I agree that it should be, but I do not believe this is a requirement for a language to be strongly typed. Specifically, C++ `auto` and C# `var` break this rule, yet both languages remain strongly typed, because misusing an object will still result in a compiler error.
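A minimal sketch of that point, in TypeScript rather than C++/C# (the names here are made up for illustration): an inferred type is still a static type, so the compiler rejects misuse even though no type annotation is visible.

```typescript
// Type inference vs. strong typing: `count` has no written type,
// but the compiler infers `number` and enforces it statically.
let count = 42;
// count = "forty-two"; // compile-time error: 'string' is not assignable to 'number'

// A function with an explicit signature, for contrast.
function double(n: number): number {
  return n * 2;
}

console.log(double(count)); // 84
```

The same holds for C# `var` and C++ `auto`: the type is deduced once, at compile time, and the variable is no less strongly typed for it.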
-
and then there's APL
let max = list[0];
for (let i = 1; i < list.length; i++) {
    if (list[i] > max) max = list[i];
}
or the APL version:
⌈/
The less you need, the more you have.
Even a blind squirrel gets a nut...occasionally.
I had never heard of APL, but after looking at this and a handful of examples online, I firmly place it in my type 2 category: it lacks any form of readability to anyone not very well versed in it, whereas most programmers who have never touched C++ can probably figure out what the C++ does. Imho, the few seconds you save typing will be wasted many times over debugging the swamp of symbols.
-
Memtha wrote:
Type
Exactly. Is it strongly typed? Can it reflect on itself? Does it support functional programming expressions? Does it support anonymous types and functions? Is it compiled (the classic definition of compiled, IL included)? If yes, I like it. If somewhat yes, I tolerate it. If no to all, I won't use it. That means: C#: yes. TypeScript: tolerated yes. JavaScript, Ruby, Python, etc.: I won't use them. Now, granted, I do like Python for certain things. There's always an exception to the rule.
Latest Articles:
ASP.NET Core Web API: Plugin Controllers and Services

Agreed with strongly typed and compiled. I personally cannot stand reflection, because it tends to lose the benefits of strong typing: string mangling at runtime can't really be compiler-checked. Example: I am presently building (not by choice) a scheduling system that pulls the assembly name, type name, and method name from SQL, loads the assembly, finds the type and method, and runs it. If some bonehead comes along and changes the method signature, no compiler errors will occur, but the task will fail at runtime. Anonymous types and functions (and especially lambdas and "properties") are perfect examples of the kind of fluff I despise in C#. It does not take that much longer to make properly named classes and functions, and the result is much more readable and reusable. The extra time pays for itself later, when I don't have to go to the docs to find the implied return type of my function based on what it is being passed to.
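The fragility described above can be sketched in a few lines. This is a hedged TypeScript analog of the C#/SQL scheduler, not the poster's actual code: the names `tasks` and `runTask` are hypothetical, and the method name stands in for the string pulled from the database. The point is that the compiler never sees the string, so a rename or typo only surfaces at runtime.

```typescript
// A registry of schedulable tasks, looked up by name at runtime --
// the moral equivalent of loading a method via reflection.
const tasks: Record<string, () => string> = {
  nightlyCleanup: () => "cleaned",
};

function runTask(methodName: string): string {
  const fn = tasks[methodName];
  if (typeof fn !== "function") {
    // No compiler ever checked this string, so a renamed or
    // mistyped task name fails only here, in production.
    throw new Error(`Task '${methodName}' not found`);
  }
  return fn();
}

console.log(runTask("nightlyCleanup")); // runs fine
// runTask("nightlyCleanUp");           // typo compiles happily, blows up at runtime
```

A direct call like `tasks.nightlyCleanUp()` would at least be caught by the type checker; the string-based lookup trades that safety for flexibility.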
As trønderen wrote above:
Typing should be explicitly visible in the program text, and clearly identified as a type.
-
How do you categorize languages? A recent article on a certain mailing list has debunked "compiled" vs "interpreted". I have long stood by my 3-mutually-exclusive-category system: Type 1) "Hey, look what I can do in only 7 lines!" (Python, C#, most new languages etc.) Type 2) "Hey, look what I can do in only 7 characters!" (Perl, awk, golf-oriented gibberish) Type 3) The good ones.
A lack of planning on your part does not constitute an emergency on mine.
Functional vs. imperative and general-purpose vs. domain-specific are my main delineations. Beyond that, "interpreted" vs. "statically compiled" vs. "JIT compiled" is technically an implementation detail. Consider JavaScript: initially interpreted, now JIT compiled. That's why I don't consider it when I deal in languages. For me it's about what I can do with a language, not even "how" I do it, although I do consider the form of typing (duck typing, static typing, etc.) when evaluating one.
Real programmers use butterflies
-
How do you categorize languages? A recent article on a certain mailing list has debunked "compiled" vs "interpreted". I have long stood by my 3-mutually-exclusive-category system: Type 1) "Hey, look what I can do in only 7 lines!" (Python, C#, most new languages etc.) Type 2) "Hey, look what I can do in only 7 characters!" (Perl, awk, golf-oriented gibberish) Type 3) The good ones.
A lack of planning on your part does not constitute an emergency on mine.
Does it solve my problem? Yes/No C# usually does, JavaScript occasionally does, SQL sometimes does, Regex rarely does... As a PLC programmer you're probably using C or Rust, but as a web developer those aren't really viable choices, C# and JavaScript are. Golf-oriented gibberish never solves my problem, that's more for fun and giggles. That's not to say I like all those languages equally, but I'll learn/use them when I have to.
Best, Sander Azure DevOps Succinctly (free eBook) Azure Serverless Succinctly (free eBook) Migrating Apps to the Cloud with Azure arrgh.js - Bringing LINQ to JavaScript
-
I had never heard of APL but after looking at this and a handful of examples online I firmly place this in my type 2 category because it lacks any form of readability to anyone not very well versed, whereas most programmers who have never touched c++ can probably figure out what the c++ does. Inho those few seconds you save typing will be wasted many times over debugging the swamp of symbols.
Well, APL is quite readable to mathematicians. (Or at least a certain share of them.) APL was not developed as a programming language at all, but as a notation for teaching the math of matrices. Kenneth Iverson used it as a "blackboard language" for a number of years when lecturing at Harvard in the late 1950s. It wasn't until he quit teaching and moved to IBM that some of his workmates said something like "Hey, we could make a computer do these operations!" But I must admit that you are right: APL is definitely not a language that you will learn in two weeks. And you must keep on using it; otherwise you will forget so much that you will have to dig up your old APL textbook to find any sense in the APL programs you wrote yourself a few years ago. My first programming language was Basic - a very basic one, with at most 286 numeric variables named A-Z or A0-A9 to Z0-Z9, and 26 string variables A$-Z$. My next language was APL, and I was extremely impressed by its power compared to Basic. (I was 17 at the time.) I still have my APL textbook, and I'd love to pick it up again - just for doing programming that is very different from at least 95% of all programming done today. (I just need a tuit.)
-
How do you categorize languages? A recent article on a certain mailing list has debunked "compiled" vs "interpreted". I have long stood by my 3-mutually-exclusive-category system: Type 1) "Hey, look what I can do in only 7 lines!" (Python, C#, most new languages etc.) Type 2) "Hey, look what I can do in only 7 characters!" (Perl, awk, golf-oriented gibberish) Type 3) The good ones.
A lack of planning on your part does not constitute an emergency on mine.
My categories are: Do I know it? Do I need to know it? Do I want to know it? Am I sufficiently not lazy to learn it?
-
Well, APL is quite readable to mathematicians. (Or at least a certain share of them.) APL was not developed as a programming language at all, but as a notation for teaching the math of matrices. Kenneth Iverson used it as a "blackboard language" for a number of years when lecturing at Harvard in the late 1950s. It wasn't until he quit teaching and moved to IBM that some of his workmates said something like "Hey, we could make a computer do these operations!" But I must admit that you are right: APL is definitely not a language that you will learn in two weeks. And you must keep on using it; otherwise you will forget so much that you will have to dig up your old APL textbook to find any sense in the APL programs you wrote yourself a few years ago. My first programming language was Basic - a very basic one, with at most 286 numeric variables named A-Z or A0-A9 to Z0-Z9, and 26 string variables A$-Z$. My next language was APL, and I was extremely impressed by its power compared to Basic. (I was 17 at the time.) I still have my APL textbook, and I'd love to pick it up again - just for doing programming that is very different from at least 95% of all programming done today. (I just need a tuit.)
If I had the inclination to learn APL, I suspect I would end up using it the way I use regex - not as a language exactly, since I don't think I'd ever write an entire program in it, just a hard-coded string that gets fed to an interpreter by the primary language for the specific operations it's good at. I do not consider regex a language, because the definition of a loop would have to be stretched to call it Turing-complete, and I would hate to think about calling functions or third-party libraries with it (no idea if that's even possible in APL, but even if it is, I doubt I'd use it).
-
I have not spent much time with pure functional languages at all - not enough to have any qualified opinion. For one project, 30 years ago, we evaluated Erlang, but rejected it for our use. My main experience with Lisp is from emacs :-) F# and Haskell I know at "Wikipedia level", not from any practical use. My professional upbringing is from the days when software designers still did data modelling. I know it is not comme il faut to state, in 2022, that Entity Relationship modelling has something to be said in favor of it, but I have seen ER being used very successfully in a number of projects to really get a grasp on the problem domain. And, it is an excellent tool for communicating with a customer: They will easily understand the concepts so that they can participate in development of the ER model, and when that is in place, they can successfully teach you the operations to be done on the data. (And sometimes they also realize that the current state of data collections and handling procedures is a mess ...) So I have always been on the data model side, rather than the function oriented one. I guess that is an essential reason why I consider strong typing essential.
Agreed that strong typing is very important. Example: I semi-recently had to debug a C# method that had not been touched in 3+ years. The return type was expando. There were six returns; one of them was missing a property. It took three years to hit the niche case in prod where that return was reached. If C# had been used correctly - strongly typed - it would have been a compiler error three years ago and no problem in prod. It took two days to find, because the line that attempted to access the missing property was in another part of the app completely: the function that called the expando method turned the result into JSON and stuck it in the db for later retrieval by the method that would eventually fail, on a monstrous line that should have been 20 different lines, and the error was an unhelpful null reference that could have been any property. Which is also why I try to avoid fluffy shortcuts like anonymous methods and types.
-
Agreed that strong typing is very important. Example: I semi-recently had to debug a C# method that had not been touched in 3+ years. The return type was expando. There were six returns; one of them was missing a property. It took three years to hit the niche case in prod where that return was reached. If C# had been used correctly - strongly typed - it would have been a compiler error three years ago and no problem in prod. It took two days to find, because the line that attempted to access the missing property was in another part of the app completely: the function that called the expando method turned the result into JSON and stuck it in the db for later retrieval by the method that would eventually fail, on a monstrous line that should have been 20 different lines, and the error was an unhelpful null reference that could have been any property. Which is also why I try to avoid fluffy shortcuts like anonymous methods and types.
That is what I call a submarine error. You are sailing along nicely, and out of nowhere it pops up and blows you out of the water. "Where did that (null) come from?" Nice catch. This is where a constructor that requires all values might protect you. Or else have a validate method, so that you return obj.validate(). I remember one particularly nasty code path, with only four branches. I defined a handful of booleans and left them uninitialized. In each code branch, I initialized some of the booleans. At the bottom of the method, I added a check like bool cya = b1 || b2 || b3;. The compiler would catch me if I tried to use an uninitialized variable. After testing, you can remove the extra checks.
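The constructor-requires-all-values defense mentioned above can be sketched like so. This is a hedged TypeScript illustration (the class name `Report` and its fields are invented for the example), but the idea carries straight over to C#: if every field must be supplied at construction, no code path can quietly produce an object with a property missing.

```typescript
// Every field is a required constructor argument, so forgetting one
// is a compile-time error on whichever code path forgot it.
class Report {
  constructor(
    public readonly id: number,
    public readonly title: string,
    public readonly total: number,
  ) {}

  // Belt-and-suspenders runtime check, per the validate() suggestion.
  validate(): boolean {
    return Number.isFinite(this.id) && this.title.length > 0;
  }
}

const r = new Report(1, "Q3", 99.5);
console.log(r.validate()); // true
// new Report(1, "Q3"); // compile-time error: argument for 'total' is missing
```

Contrast this with an expando or ad-hoc object literal, where a branch that omits `total` compiles fine and fails only when something finally reads the property.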
-
How do you categorize languages? A recent article on a certain mailing list has debunked "compiled" vs "interpreted". I have long stood by my 3-mutually-exclusive-category system: Type 1) "Hey, look what I can do in only 7 lines!" (Python, C#, most new languages etc.) Type 2) "Hey, look what I can do in only 7 characters!" (Perl, awk, golf-oriented gibberish) Type 3) The good ones.
A lack of planning on your part does not constitute an emergency on mine.
My categories ain't terribly different from yours. I categorize by what they value:
- backwards continuity
- a certain principle or set thereof
- pragmatism, as in "getting things done without standing in the way" - which would be the good ones
-
How do you categorize languages? A recent article on a certain mailing list has debunked "compiled" vs "interpreted". I have long stood by my 3-mutually-exclusive-category system: Type 1) "Hey, look what I can do in only 7 lines!" (Python, C#, most new languages etc.) Type 2) "Hey, look what I can do in only 7 characters!" (Perl, awk, golf-oriented gibberish) Type 3) The good ones.
A lack of planning on your part does not constitute an emergency on mine.
The good, the bad and the ugly. Ugly) The languages my colleagues prefer that I don't like. Bad) Languages that neither my colleagues nor I use. Good) The languages I use.
Nothing succeeds like a budgie without teeth.
-
How do you categorize languages? A recent article on a certain mailing list has debunked "compiled" vs "interpreted". I have long stood by my 3-mutually-exclusive-category system: Type 1) "Hey, look what I can do in only 7 lines!" (Python, C#, most new languages etc.) Type 2) "Hey, look what I can do in only 7 characters!" (Perl, awk, golf-oriented gibberish) Type 3) The good ones.
A lack of planning on your part does not constitute an emergency on mine.
Honestly mostly I just think in terms of whether or not a given language is suitable for my current task.
-
Well, APL is quite readable to mathematicians. (Or at least a certain share of them.) APL was not developed as a programming language at all, but as a notation for teaching the math of matrices. Kenneth Iverson used it as a "blackboard language" for a number of years when lecturing at Harvard in the late 1950s. It wasn't until he quit teaching and moved to IBM that some of his workmates said something like "Hey, we could make a computer do these operations!" But I must admit that you are right: APL is definitely not a language that you will learn in two weeks. And you must keep on using it; otherwise you will forget so much that you will have to dig up your old APL textbook to find any sense in the APL programs you wrote yourself a few years ago. My first programming language was Basic - a very basic one, with at most 286 numeric variables named A-Z or A0-A9 to Z0-Z9, and 26 string variables A$-Z$. My next language was APL, and I was extremely impressed by its power compared to Basic. (I was 17 at the time.) I still have my APL textbook, and I'd love to pick it up again - just for doing programming that is very different from at least 95% of all programming done today. (I just need a tuit.)
As an applied math major at an engineering-focused school back in the 80s I actually had APL as a required 1-credit course. I kind of enjoyed it *because* it was a little arcane.
-
How do you categorize languages? A recent article on a certain mailing list has debunked "compiled" vs "interpreted". I have long stood by my 3-mutually-exclusive-category system: Type 1) "Hey, look what I can do in only 7 lines!" (Python, C#, most new languages etc.) Type 2) "Hey, look what I can do in only 7 characters!" (Perl, awk, golf-oriented gibberish) Type 3) The good ones.
A lack of planning on your part does not constitute an emergency on mine.
1)$$$ 2)$$ 3)$ 4)Nada