Building a new proper left-to-right executing programming language
-
Oh, I get that; my confusion is how that decides how we should use the symbols. Maths is the language, and Maths decides how the language should be read, not the language where some of the symbols originated.
We had maths before we had symbols for numbers. Then we found that the Roman system had its limitations for doing maths, so we switched.
Bastard Programmer from Hell :suss: If you can't read my code, try converting it here[^] "If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
-
It depends on who you are trying to communicate with: a computer or a programmer. From a computer's perspective, if you want to model the execution steps, there is already a better notation for this: reverse Polish notation (a 12 +). From a programmer's perspective, you want the code to be easy to read, and I don't see how your proposal helps with that.
- When reading code, the variable being set is the most important part.
- If I'm trying to understand code, I want to see where a variable is set, which is easier when scanning a block where the variables are aligned. It is easier for my eyes to find the line x = ..... rather than ..... = x.
- Following the logic of an algorithm involves understanding where variables are mutated as much as what they are set to.
- For complex expressions, I'll probably read and understand them only once, while I'll explore the looping logic and the structure of the flow of the code repeatedly.
Sixty years ago, there was economic value in a programmer spending significant time making things easier for the computer. Now the value is in making things easier for the programmer, even if that is significantly more complex for the computer. I've always wanted languages to adopt a true assignment operator, x <- a + 12, but it would need to be a single character that exists as an easily usable key on my keyboard. Interestingly, Visual Studio allows Unicode variable names, so I've written software using genuine alpha and beta glyphs.
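As an aside, the point that reverse Polish notation models the execution steps can be shown directly. This is a minimal hypothetical sketch (the function name and token format are my own, not from any post here):

```python
# Minimal reverse-Polish evaluator: each token is one execution step,
# processed strictly left to right against a single stack.
def eval_rpn(tokens, env):
    stack = []
    for tok in tokens:
        if tok == "+":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif tok in env:
            stack.append(env[tok])  # variable lookup
        else:
            stack.append(int(tok))  # numeric literal
    return stack.pop()

# "a 12 +" with a = 30: push a, push 12, add.
print(eval_rpn(["a", "12", "+"], {"a": 30}))  # prints 42
```

Note there is no precedence or grouping to resolve: the token order is the evaluation order.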
NeverJustHere wrote:
I've written software using genuine alpha and beta glyphs
Heaven help you if you ever look at the source code on a machine where the system locale is not set to English. I had a case the other day where a Greek mu (µ) character in the UI was showing up as a kanji character. English version of Windows, check. User language set to English, check. English keyboard selected, check. ... Wait a second. Who set the bloody-be-damned system locale to Simplified Chinese? At some point the box had been restored to the original image, and this was during a brief period when our out-sourced assembly folks were setting all of our boxes to Simplified Chinese. Grrr... :mad:
Software Zen:
delete this;
-
Most spoken languages are written LtR (left-to-right), but Maths, the number language, is actually written RtL because the decimal numbers are Arabic numerals. (I know, most will be surprised, but it's true about Maths' RtL direction.) But somehow the RtL and LtR languages got mixed up. Instead of writing
x = a + 12
how about changing it to
a + 12 = x
So, what are your views on creating a new programming language which follows proper LtR execution? Is there already such a language? (Please, just don't remind me that there are already lots of programming languages (I know already) and that I must not (try to) create one more. :) )
Hold on. Are we talking about computer programming languages, or people programming languages? Because computers don't even know what right and left are, so they don't care. If you're really desperate to fix this problem-that-ain't-even-remotely-a-problem, then use a modular approach, where which "direction" the flow goes depends on the structure of the source data and whatever overloading you have set up.
I wanna be a eunuchs developer! Pass me a bread knife!
-
I am not sure, but the direction could have changed in the last 100-400 years because of the influence of Western culture.
-
Just because they may have originated in another language does not mean they are still that language; it depends on the context in which they are used. Otherwise you could easily argue that a lot of English is not actually English because it originated in another language. It doesn't matter where it came from; only how it's used matters. If your argument is that Maths is its own language, then it can also define its own read order (i.e. LtR); it doesn't matter where the numbers originally came from and how they were originally read. My point is, there is no reason why they have to be read RtL just because they are decimal numbers.
musefan wrote:
If your argument is that Maths is its own language, then it can also define its own read order (i.e. LtR); it doesn't matter where the numbers originally came from and how they were originally read.
My concern is about programming, not about Maths. So Maths can have its own read-write direction, but in programming we can define what is logical, because programming is all about logic, isn't it?
-
Nikunj_Bhatt wrote:
So, what are your views on creating a new programming language which follows proper LtR execution?
What "problem" would that solve? Yeah, I didn't think so. There's plenty of languages to know already, I don't think anybody wants another one that only "fixes" this.
It may not solve any problem. I presented my thought from a logical view, as programming is all about logic. I have already written that I know there are plenty of programming languages; I am not actually going to create any language. :)
-
I am talking about a computer programming language with a more logical syntax that remains fairly easy for programmers to understand.
-
Quote:
there is already a better language for this representation, reverse polish notation. (a 12 +)
You were so close! Use prefix notation instead
(+ a 12)
and then you already have a powerful language with modern language features. Postfix notation
(a 12 +)
gives you Forth, with little expressive power. Prefix notation
(+ a 12)
gives you Lisp, with all currently known language features.
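To make the comparison concrete, a prefix form like (+ a 12) is just as mechanical to evaluate as the postfix form. A tiny illustrative sketch in Python rather than Lisp (names and the nested-tuple encoding are my own assumptions):

```python
# Evaluate a tiny prefix (Lisp-style) expression given as nested tuples.
# ("+", "a", 12) stands for (+ a 12); symbols are resolved in `env`.
def eval_prefix(expr, env):
    if isinstance(expr, tuple):
        op, *args = expr
        vals = [eval_prefix(arg, env) for arg in args]  # evaluate sub-expressions
        if op == "+":
            return sum(vals)
        raise ValueError(f"unknown operator: {op}")
    if isinstance(expr, str):
        return env[expr]   # variable reference
    return expr            # numeric literal

print(eval_prefix(("+", "a", 12), {"a": 30}))  # prints 42
```

The recursive structure is what lets prefix notation nest arbitrarily, which is part of why Lisp's expressive power grows so naturally out of it.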
-
Natural and formal languages (maths being one of the latter; programming languages are another example) have different use cases. As hard as it is to make an exact statement in, I dare say, every single natural language (although some are better suited to the task than others), it is just as easy in the formalized language of maths or programming (C++'s convoluted syntax being, I dare say, an exception). My point is that applying a set of rules not developed for formality to something that has to be formal may not yield the best results.
-
If you really want to program in RPN, there is Forth. In my brief encounter with it I deemed it a write-only language. At least, there was Forth; I have not heard much of it in many years.
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
Forth is still out there. It's still being used as an intermediate language at a place I worked for years ago. We (actually I) decided to write a multitasking, subroutine-threaded Forth for our industrial controllers. On the PC we developed an IDE where the programmer would just develop flow charts. Each flow chart would become a task on the industrial controller. The flow charts were then compiled to Forth by the IDE and downloaded to the controller, which would compile it to machine code.
-
Or use a good one, already invented: POP-2 - Wikipedia[^]. With lambdas, managed memory, closures (full and partial), user-defined operators, user-defined setter functions, functions with multiple results, an incremental compiler ... and with alternative LtR syntaxes: `f(a,b) ->x ->y` or `a; b.f() ->x ->y`
-
That sounds interesting. Other than that it uses Forth. :cool: The last thing I read about Forth was many years ago. It was about the development of the SPARC processor and Sun workstations: they embedded Forth in the ROMs and wrote the boot loader in it. As I recall, it came up and ran on the first attempt.
-
You can already read this right to left: "=" is read as "is assigned to", not "equals". Duh!
x = a + 12
Could be read as
12 added to a is assigned to x
Or left to right:
x is assigned a added to 12
Is your language going to support order of operations that follow neither direction?
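That last question matters more than it looks: operator precedence already means evaluation order is neither purely left-to-right nor right-to-left. A quick illustration in Python (any mainstream language behaves the same way here):

```python
a = 5
# Multiplication binds tighter than addition, so this groups as a + (12 * 2),
# regardless of whether you read the line left-to-right or right-to-left.
print(a + 12 * 2)    # prints 29
print((a + 12) * 2)  # prints 34: explicit grouping gives a different result
```

So a "proper LtR" language would have to either abandon precedence (as Forth and Lisp effectively do) or accept that reading direction and evaluation order still diverge.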
-
Forth (programming language) - Wikipedia[^] It never fails to amuse me how the young ones forget their history, if, indeed, they bother to learn it in the first place. Oh - and it's
a @ 12 + x !
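For anyone rusty on Forth: @ fetches a variable's value onto the stack and ! stores a value into a variable, so the Forth equivalent of x = a + 12 fetches a, pushes 12, adds, and stores into x. A rough Python simulation of that stack traffic (the dictionary-as-memory model is my own simplification, not real Forth internals):

```python
# Simulate the Forth words @ (fetch) and ! (store) for x = a + 12.
mem = {"a": 30, "x": 0}  # variables as named memory cells
stack = []

stack.append(mem["a"])                 # a @   -- push the value stored in a
stack.append(12)                       # 12    -- push the literal 12
stack.append(stack.pop() + stack.pop())  # +   -- pop two values, push their sum
mem["x"] = stack.pop()                 # x !   -- pop a value, store it into x

print(mem["x"])  # prints 42
```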
-
I liked Forth a lot, but then I grew up on assembly language. The whole TIL (threaded interpreted language) scheme is extremely simple and is easily ported to different processors. One of the main problems with Forth is that the programmer is assumed to be an expert. There's pretty much no hand holding. Forth Inc. is still in business too (www.forth.com).
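The TIL scheme mentioned above really is simple enough to sketch in a few lines: a "word" is just a sequence of routine references, and the inner interpreter walks that list. A toy Python illustration (the word and helper names are invented for the example, and real Forth threads machine addresses, not Python callables):

```python
# Toy threaded-code interpreter: a "word" is a list of routine references,
# and the inner interpreter simply calls them in order against one stack.
stack = []

def lit(n):                 # build a "push this literal" routine
    return lambda: stack.append(n)

def add():                  # pop two values, push their sum
    b, a = stack.pop(), stack.pop()
    stack.append(a + b)

def dup():                  # duplicate the top of the stack
    stack.append(stack[-1])

# A word "double-plus-one", defined as a thread of existing routines.
double_plus_one = [dup, add, lit(1), add]

def run(thread):            # the inner interpreter: walk the thread
    for routine in thread:
        routine()

stack.append(20)
run(double_plus_one)
print(stack.pop())  # prints 41
```

Porting amounts to rewriting the handful of primitive routines and the inner loop, which is why the scheme moved between processors so easily.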
-
I didn't care for Forth. I didn't grasp it immediately and it was always a struggle for me to deal with. The same applies to RPN for me. I think I was the only one in my engineering school who didn't have an HP calculator. Coincidentally, I went to school in the same town where HP designed and built them at the time.
-
My views?
1. You are wasting your time.
2. Using = for assignment is evil, but perhaps a necessary evil we are stuck with forever.
3. Postfix (RPN) is no more natural, or unnatural, than any other notation. A great benefit of RPN is that you can parse it without needing recursive descent to figure out execution order.
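That parsing point is easy to demonstrate: infix needs a precedence-aware pass (e.g. Dijkstra's shunting-yard algorithm) just to recover execution order, while RPN is already in execution order. A sketch restricted to + and * with no parentheses (the function name and token format are illustrative assumptions):

```python
# Convert infix tokens to RPN with the shunting-yard algorithm,
# handling only the binary operators + and * (no parentheses).
PREC = {"+": 1, "*": 2}

def to_rpn(tokens):
    output, ops = [], []
    for tok in tokens:
        if tok in PREC:
            # Pop waiting operators of higher or equal precedence first.
            while ops and PREC[ops[-1]] >= PREC[tok]:
                output.append(ops.pop())
            ops.append(tok)
        else:
            output.append(tok)  # operands pass straight through
    while ops:
        output.append(ops.pop())
    return output

print(to_rpn(["a", "+", "12", "*", "2"]))  # prints ['a', '12', '2', '*', '+']
```

The RPN result can then be executed with a single left-to-right loop over a stack; no recursion needed.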
«Where is the Life we have lost in living? Where is the wisdom we have lost in knowledge? Where is the knowledge we have lost in information?» T. S. Eliot
-
I like it because of its interactive nature. Write a 'word' and you can test it immediately. That made for much quicker development at the time. Also, it was relatively easy to make a multitasking Forth (round-robin scheduling).
Take a look at the PostScript manual. As far as I can see, it's Forth with extra graphics bits.