Which programming language to learn
-
CDP1802 wrote:
Memory addresses and how to calculate them are the most fundamental things on a computer.
Yes, but not many people understand the fundamentals any more. Far too many think programming is a question of dragging some tools onto a page, double clicking to generate event handlers, and asking CodeProject for the rest.
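For the record, C puts that address arithmetic right on the surface. A minimal sketch (the array and offsets are purely illustrative):

    #include <stdio.h>

    int main(void)
    {
        int a[4] = {10, 20, 30, 40};
        int *p = a;                      /* p holds the address of a[0] */

        /* Pointer arithmetic scales by the element size: p + 2 is the
           address of a[2], i.e. 2 * sizeof(int) bytes past a[0]. */
        printf("%p %p\n", (void *)(p + 2), (void *)&a[2]);
        printf("%d\n", *(p + 2));        /* prints 30 */
        return 0;
    }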
One of these days I'm going to think of a really clever signature.
Well said, Richard! :-D
Will Rogers never met me.
-
I've never been scared of pointers - I started out programming in machine language and assembly. But the notation sucks, and the libraries that were available when I was using it were almost as tedious as writing one's own from scratch (MFC, for instance). I suspect that things are better now, but I'm not interested in looking. :)
Will Rogers never met me.
No, things have not gotten much better. Since the arrival of .Net, native C++ has been treated like a stepchild in Visual Studio. On the good side, I can dig out ancient code and get it to work again. So, your concern is not with the language itself, but with the lack of decent IDE support and of a modern library of a similar caliber to the .Net framework?
-
Well, those, plus the fact that the notation is unnecessarily cryptic. When I started out, it was during the time that people realized that maintenance cost more than development, primarily because of unstructured programming, and language syntax that was difficult to read and understand. Languages evolved as a result into ever more readable forms, until C++. That was a giant step backward. It helped with the structure problem, but destroyed any hope of humans being able to read it. Added to that - and for me it was a matter of timing, I guess - was the shift from procedural programming to event-driven programming, popularized by Windows. C++ by itself was a challenge, but not insurmountable. Add Windows, and message pumps and handlers and all the crap that comes with it; it was too much for me to assimilate. C# came along, and saved my butt, at least for the small amount of programming I still do. Fortunately, I don't expect ever again to have to program for a living, though I like to take on an app now and then just to keep my fingers nimble. One never knows where the next job will be, nor what skills it might require. :-D
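For anyone lucky enough never to have met it, the "message pump" is the little loop at the heart of every classic Win32 program; a minimal sketch using the standard calls (window creation and the window procedure omitted, and the function name is illustrative):

    #include <windows.h>

    /* Every keystroke, click and repaint arrives as a message that must
       be fetched, translated and dispatched to the window's handler. */
    int run_message_pump(void)
    {
        MSG msg;
        while (GetMessage(&msg, NULL, 0, 0) > 0)
        {
            TranslateMessage(&msg);   /* turn raw key events into WM_CHAR */
            DispatchMessage(&msg);    /* route to the window procedure */
        }
        return (int)msg.wParam;       /* exit code from PostQuitMessage */
    }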
Will Rogers never met me.
-
This must have been one of the first programs I entered into my old computers to see if I had not fried anything while soldering it together:
0000 7B
0001 3F 00
0003 7A
0004 30 01

This is just a simple 'Hello World' type of program. It just turns on a LED and turns it off when you press the input key next to the hex keyboard. Barely enough to show that CPU, memory and I/O are alive and well so far. You know what I like so much about it? Apart from being formatted one instruction per line, it is absolutely free of style or syntax. Just instruction codes, followed by one or two bytes of data if needed. No other representation (besides assembly code perhaps) can give you a more precise or shorter description of what your code does. There is no potential for misunderstanding or any hidden side effect. Every instruction alters the CPU's state in a precisely defined way. The hexadecimal notation does not disturb me one bit; the codes have become as readable to me as any other language.
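In mnemonics that is: SEQ (set Q, LED on); BN4 back to the start while the key is up; REQ (reset Q, LED off); BR back into the wait loop. A toy interpreter in C makes the "precisely defined state change" point concrete (the opcode map is the usual 1802 one as I remember it, and the simulated key press is illustrative):

    #include <stdio.h>

    /* The four-instruction CDP1802 program above, assuming the usual
       opcode map: 7B = SEQ (Q on), 7A = REQ (Q off),
       3F aa = BN4 (branch if the EF4 key is NOT pressed),
       30 aa = BR (unconditional branch). */
    unsigned char mem[] = { 0x7B, 0x3F, 0x00, 0x7A, 0x30, 0x01 };

    int main(void)
    {
        unsigned pc = 0, q = 0;
        for (int step = 0; step < 8; step++) {
            unsigned ef4 = (step >= 2);      /* key pressed from step 2 on */
            unsigned char op = mem[pc++];
            switch (op) {
            case 0x7B: q = 1; break;                      /* SEQ: LED on  */
            case 0x7A: q = 0; break;                      /* REQ: LED off */
            case 0x3F: { unsigned char a = mem[pc++];     /* BN4 */
                         if (!ef4) pc = a; } break;
            case 0x30: pc = mem[pc]; break;               /* BR */
            }
            printf("pc=%04X  Q=%u (LED %s)\n", pc, q, q ? "on" : "off");
        }
        return 0;
    }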
-
Which one, in your opinion, is best to start learning? HTML? C++? Perl? C++/VB/C#/F#/Python on .NET? Any others, and why? Which one did you learn first? And a salary list, please. Thanks
C, for the simple reason that it teaches you best how computers work without getting bogged down in the details of assembly language (a language which I love). From there, from a purely pragmatic perspective, learn either Java or .NET; yeah, learn both, but become an expert in one or the other. One way to choose is to decide whether you really like Windows and Visual Studio or prefer Linux/UNIX. (Right now, Java is hotter due to Android, but that demand will slip in time.) Python, Perl, and HTML may help you at some jobs, but they will be ancillary to Java or .NET. Oh, and learn SQL really well. NOTE: This is coming from a die-hard C/C++ developer who has a profound disinterest in the types of projects using Java and heavy .NET. The result is rather difficult job searches; many employers are looking for jacks of all trades, and I'm not that, nor interested in being one. The point is that until I landed my latest job, I fielded calls for all sorts of positions and got a pretty good feel for the market.
-
All languages are pretty much the same (although HTML isn't a language). Learn one, get a job doing it, and the rest will come along as your career develops. Over the last 31 years, I've done the following for money, and learned each one when I needed to learn it: Fortran, Cobol, CMS-2Y, Assembly, Pascal, Delphi (which is a fancy name for Pascal), Modula-2, dBase2, SQL (Oracle and SQL Server), Ada, C, C++, C#, VB, VB.Net, PHP, and HTML. It's difficult to specialize AND stay employed. Usually, you can do one or the other, but not both.
".45 ACP - because shooting twice is just silly" - JSOP, 2010
-----
You can never have too much ammo - unless you're swimming, or on fire. - JSOP, 2010
-----
"Why don't you tie a kerosene-soaked rag around your ankles so the ants won't climb up and eat your candy ass." - Dale Earnhardt, 1997John Simmons / outlaw programmer wrote:
...HTML isn't a language...
Ah well, of course it is. HTML is an acronym, and the "L" in "HTML" stands for "Language". The full string is Hypertext Markup Language, so of course it is not a programming language, but it is a language. XML (unembellished) is similarly afflicted. I would argue that XSL, based on XML, is a programming language: it has constructs for accumulators, loops, while, switch, if (and other conditionals), variables, etc., which I think qualifies it as such. HTML makes provision for including ECMAScript/JavaScript, which is a language, but (similar to XML including XSL) the HTML itself is not the programming part.
-- Harvey
-
It might stand for "Language", but in actuality, it's a markup "specification". It should have been called HTMS. And following that, XMS and XAMS...
".45 ACP - because shooting twice is just silly" - JSOP, 2010
-----
You can never have too much ammo - unless you're swimming, or on fire. - JSOP, 2010
-----
"Why don't you tie a kerosene-soaked rag around your ankles so the ants won't climb up and eat your candy ass." - Dale Earnhardt, 1997 -
CDP1802 wrote:
0000 7B
0001 3F 00
0003 7A
0004 30 01
From memory I read this as (BBC 6502 code):
LDA &3F   \ this will be wrong - it will produce a single byte address
STA &1030
I bet I am wrong though ;-)
-
Not quite. It would work on most early RCA CPUs like a CDP1802, CDP1804 or CDP1805. I'm not so sure about the CDP1801, which had a smaller instruction set than the later CPUs. And that simplicity was their greatest advantage: they had many registers and almost no addressing modes, which makes them (as far as I know) the first RISC CPUs :)