Is there a programming language...
-
Rob Grainger wrote:
Don't know if you're familiar with Haskell at all, but it's a wonderful thing.
Nope, but I've been reading up on it since your previous post!
Rob Grainger wrote:
I think I've learned more from learning Haskell than any language since I learned Smalltalk.
Interestingly, I learned more about the principles of programming from this book[^] than I ever have from any actual comp-sci book. I kid you not - biology and programming have a lot in common. Marc
I am not surprised. Alan Kay has stated that biology heavily influenced the invention of OOP. I think he touches on that in this interview[^].
"If you don't fail at least 90 percent of the time, you're not aiming high enough." Alan Kay.
-
...that works "easily" with semantic types? For example, I may have:
int age = 51;
which completely loses the concept that 51 is an age (in years). What I want is something like:
AgeInYears myAge = 51;
and yet still be able to specify that I can perform, say, arithmetic operations on "myAge". For example, in C#, I could write:
class AgeInYears
{
    public int Value { get; set; }
}
// ... implement operators on AgeInYears
But that gets messy real fast - every "semantic type" needs these operators, etc. Furthermore, the unit of measurement is still not handled very elegantly. So, as the question states, are there programming languages out there that are more expressive of semantic types? Marc
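To make the "gets messy" point concrete, here is a minimal sketch (illustrative only - AgeInMonths, LengthInMetres and the operator set shown are just examples) of where the operator route leads in C#:
// Minimal sketch of the boilerplate: every semantic type ends up needing this.
struct AgeInYears
{
    public int Value { get; set; }

    public AgeInYears(int value) : this() { Value = value; }

    // Arithmetic between two ages
    public static AgeInYears operator +(AgeInYears a, AgeInYears b)
    {
        return new AgeInYears(a.Value + b.Value);
    }

    public static AgeInYears operator -(AgeInYears a, AgeInYears b)
    {
        return new AgeInYears(a.Value - b.Value);
    }

    // Comparisons (must be declared in pairs)
    public static bool operator >(AgeInYears a, AgeInYears b) { return a.Value > b.Value; }
    public static bool operator <(AgeInYears a, AgeInYears b) { return a.Value < b.Value; }

    // ...plus ==, !=, >=, <=, ToString, Equals, GetHashCode, conversions, and so on,
    // all repeated again for AgeInMonths, LengthInMetres, ...
}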
You could use C/C++:
typedef int AgeInYears;
AgeInYears myAge = 51;
No need to define operators etc. You could even use Ada:
type AgeInYears is new Integer;
type AgeInMonths is new Integer;
myAge: AgeInYears;
yourAge: AgeInMonths;
Ada won't let you type myAge := yourAge / 12; unless you cast it.
-
An interesting language for doing this type of thing is Julia (http://julialang.org/[^]). In Julia you can A) Use typedefs to indicate that AgeInYears is a typedef for Int or B) Make a new type AgeInYears that is a subtype of Integer and implement a converter function (these are a standard Julia concept) from Int to AgeInYears so that age::AgeInYears = 5 will resolve correctly.
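For anyone who doesn't read Julia, a rough C# analogue of option B might look like the sketch below. It is not Julia, just an illustration of the "distinct type plus converter" idea, and the names are made up:
// Hypothetical C# analogue: AgeInYears is its own type, but a plain int
// converts into it on assignment, much like Julia's convert mechanism.
struct AgeInYears
{
    public readonly int Value;

    public AgeInYears(int value) { Value = value; }

    public static implicit operator AgeInYears(int value)
    {
        return new AgeInYears(value);
    }
}

class Demo
{
    static void Main()
    {
        AgeInYears myAge = 5;            // resolves via the conversion, like age::AgeInYears = 5
        // int months = myAge;           // the reverse would need an explicit conversion
        System.Console.WriteLine(myAge.Value);
    }
}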
-
greydmar wrote:
I believe that this "semantic" threshold is outside the domain of a programming language (commonly, it is a "system domain" concept),
Yes, that's what makes it interesting to look at. :) Marc
Well, let's consider the following task: imagine the same issue when your development includes (for example) some SQL tables with some (or a lot of) semantic fields (unit of measure, complex number, currency, etc.) and you decide (of course, you are a "programmer"!! ;P) to perform some calculations using SQL dialects (T-SQL: stored procedures, UDFs, packages, PL/SQL)... So, it's a kind of art to do ordinary operations (conversion, arithmetic) without "semantic types", no?
-
Stroustrup has written somewhere on implementing SI measures in C++ using user-defined literals and simple classes. That would work well, but should really be part of the standard library.
"If you don't fail at least 90 percent of the time, you're not aiming high enough." Alan Kay.
-
Rob Grainger wrote:
Alan Kay has stated that biology heavily influenced the invention of OOP
The ancient Greeks established the paradigm of capturing world events in models (schemes of abstract concepts) without caring about empirical verification... Biology and statistics, with their main concepts (store, compare, infer!), have heavily influenced everything from medicine to the mechanical sciences from a different perspective: world evidence first (facts, events, observations), model concepts later. An interesting book about this, available only in Portuguese (BEHIND THE SCENES OF SCIENCE - Scientists' Resistance to Scientific Innovation), is very enlightening.
-
aschmahmann wrote:
An interesting language for doing this type of thing is Julia (http://julialang.org/[^]).
Reading the docs, that looks very very interesting! Thank you for pointing out Julia! Marc
-
Rob Grainger wrote:
"If you don't fail at least 90 percent of the time, you're not aiming high enough."
Well. Try this with your employer! (it's a joke)
I still feel that would be an improvement on some of the folk whose footsteps I'm following (see various entries in The Weird and The Wonderful) - at least I'd succeed 10% of the time ;-)
"If you don't fail at least 90 percent of the time, you're not aiming high enough." Alan Kay.
-
I hate myself for typing this:
namespace TestApp1
{
    using AgeInYears = System.Int32;

    class Program
    {
        static void Main(string[] args)
        {
            AgeInYears myAge = 10;
            AgeInYears oldAge = 50;
            AgeInYears timeUntilOldAge = oldAge - myAge;
        }
    }
}
Yes, that's perfectly legal C# code. It's technically still an int, and it works the same way a #define does in C++ to replace types. It only works within a single code file, though.
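To underline the "technically an int" point, here is a small sketch (hypothetical TestApp2) showing that two such aliases are freely interchangeable, because the compiler only ever sees Int32 - exactly the mixing that the Ada example earlier in the thread would reject:
namespace TestApp2
{
    // Both aliases resolve to the same underlying type.
    using AgeInYears = System.Int32;
    using AgeInMonths = System.Int32;

    class Program
    {
        static void Main(string[] args)
        {
            AgeInMonths months = 612;
            AgeInYears years = months;   // compiles without complaint - no semantic distinction
            System.Console.WriteLine(years);
        }
    }
}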
-
Kinda. As I said in a different reply, typedef has valid uses (defining a BOOL, for example) and the alias it creates applies everywhere the header is included, whereas the C# "using" alias is more like a #define replacement confined to a single file.
-
I suppose typedefs get type-checked at compile time and #defines do not. Is that correct? So maybe the typedef equivalent is the safer way to go?
David
A #define will effectively be checked at compile time just like a typedef, because the preprocessor runs through and replaces all instances of the #define name with its value before the compiler sees the code, so the substituted type still gets type-checked. If C# had a typedef equivalent that would certainly be the better way to go, but it doesn't. So the only real way to refer to a type by a different name is to use a "using" alias or to create your own value type that wraps the equivalent type.