Book Recommendation
-
I qualify as an "old timer" and am still working, albeit about as far away from my earliest software work as could be. I wouldn't use vi unless my life depended on it, and not because of any aversion to full-screen editors; I'd rather use Notepad. Now, as to "old timers" and abandoned technology: even though I spent the first 15-20 years of my career programming ASM on various machines, writing everything from OSes to device drivers to compilers, do I use ASM today, or prefer it? I use the most efficient tool appropriate to the task at hand. As for teaching ASM, I do wonder where the ASM programmers will come from to write the inevitable code that cannot be written in C (or whatever high level language you choose). Somewhat amazing to me that a computer science major can graduate w/o understanding how a computer works, at least at the basic level of ASM. Black Hats cannot be the ONLY folks who understand ASM, or we're all in a lot of trouble.
The only programmers who really need to know the machine code (whether considered as binary instruction codes or symbolic assembly language) will be compiler writers. Knowing all the details of the instruction set, addressing modes, status bits etc. is highly specialized knowledge, needed by very few others. It is like the huge matrix models managed by meteorology software, implementing transcendental functions for a math library, or the lighting model of a 3D graphics package. We didn't learn meteorology or FEM algorithms in college; those who need it learn it at work (or maybe they study meteorology in college and learn programming at work).

We did learn the transistor design for dynamic and static RAM cells - never needed it! We did learn the series expansions for trigonometric functions - never needed them. We did assembler programming exercises, and I did need that for about five years, but not for the last 30 years. We did learn nine (or was it eleven?) different disk scheduling algorithms, made completely irrelevant by disks with megabytes of cache and virtualized track/sector numbers. We learned lots of ways to manage a heap; I used that knowledge for twenty-five years, fully convinced that nothing could beat explicit malloc/free. Then I read about CLR garbage collection (in "CLR via C#"), and had to admit: "Oops, I never thought of that ... and that, and ...". No doubt: CLR garbage collection is a lot smarter than any memory handling I have coded myself. Any modern compiler makes optimizations that you never would have thought of.

My company develops processing modules for embedded systems: I don't think there is a single assembly instruction anywhere in our code. Even our in-house CPU extensions and on-chip "peripherals" (like the BT radio, encryption unit, sensor interfaces etc.) are managed through general C library functions.

I am not sad that young programmers no longer learn the transistor design of a flip-flop, how to use Newton's method when implementing a math function library, or how to judge FCFS against elevator disk scheduling. Such knowledge doesn't help you write values to RAM in a better way, aim that flame thrower in the right direction with higher precision or reliability, or sort the queue of two entries in the most efficient way before sending the requests to the disk. Programmers still need an understanding of a lot of hardware aspects: word length / numeric range and limited FP precision is one prime example. But that is on the architectural level, not the implementation.
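To make that last point concrete, here is a minimal C# sketch (my own illustration, not taken from the book) of what limited FP precision and word-length / numeric-range limits look like in practice:

using System;

class PrecisionDemo
{
    static void Main()
    {
        // Limited FP precision: 0.1 has no exact binary representation,
        // so summing it ten times does not give exactly 1.0.
        double sum = 0.0;
        for (int i = 0; i < 10; i++)
            sum += 0.1;
        Console.WriteLine(sum == 1.0);         // False
        Console.WriteLine(sum.ToString("R"));  // 0.99999999999999989

        // Word length / numeric range: 32-bit int arithmetic silently
        // wraps around in the default unchecked context.
        int big = int.MaxValue;
        Console.WriteLine(big + 1);            // -2147483648
        // checked { Console.WriteLine(big + 1); }  // would throw OverflowException

        // decimal trades range and speed for exact base-10 arithmetic,
        // which is why it is the usual choice for money.
        decimal d = 0m;
        for (int i = 0; i < 10; i++)
            d += 0.1m;
        Console.WriteLine(d == 1.0m);          // True
    }
}

None of this requires knowing a single assembly instruction, but it is exactly the kind of architectural hardware awareness I mean.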
-
If that option is available, it might be a very good option. Lots of people live too far away from a college. Even if there is a local college, the course may be taught at times when you cannot leave your ordinary work. Or admission to the college requires that certain formalities are in place, e.g. that the course is available only to full-time students. Finally (this might be a bigger problem in Europe than in the US): some colleges/universities fiercely cling to the idea that Windows, or anything else coming from MS, is toy software - Real, Professional Software is Linux based (and with a command line interface, not a GUI). All basic courses are based on Linux and open-source software; Windows software/tools are introduced only as one of several options in courses specializing in end-user application development. Lots of newly educated bachelors and masters spend years of frustration when entering working life, realizing how much toy software is out there, and how difficult it is to enlighten people about the blessings of Linux and command line interfaces. If your local college is of that sort, you can go there to learn Linux and C (and possibly Python), but you may search in vain for C#, VS and .NET related courses.
-
Quote:
The first language / environment you learn is like your first sweetheart - you'll carry joyful memories from that time for the rest of your life.
My 'first' was BASIC and no, I don't.
Now that you mention it... I should have qualified it: The first serious language... (I started with BASIC, too, when the language was so basic that variables were named A - Z and A0 - Z0 up to A9 - Z9: 286 numeric variables maximum, plus 26 string variables A$ - Z$.) You are right: that doesn't bring up any joyful memories. In fact, I had suppressed that memory entirely.
-
I agree with much of what you say; however, I disagree with your premise that a developer doesn't need to know HOW something works. Frameworks are created and abandoned with such intense frequency today that without understanding the basics of those frameworks, it is impossible to know how to proceed with the maintenance of software. Far too many developers seem to believe that the software life-cycle is: write something brand new, leave it, and move on to a new project. Instead, most software lives a long time, with many changes needed through the years. Unless those initial developers and the maintenance engineers who come along have a mutual understanding of HOW coding works, the changes are doomed to fail. Our industry is the current equivalent of urban development: tear down whatever currently exists and build new, over and over. That process keeps the money flowing and the builders happy UNTIL there is no money to flow because the entire infrastructure has broken down. At that point, those who understand the basics survive, and those who do not become part of the unemployed masses.
I certainly think you should know the workings of the layer you build your software on, directly below your layer, but not ten layers down. But you should distinguish between architecture and implementation: the data structures, the interactions between functions etc. are essential. If your understanding of the layer below you breaks down when the 32-bit CPU is replaced by a 36-bit CPU (are they still made?), then you have spent your resources wrongly. Likewise if the layer below is re-implemented in a different language, but offering the same call interface.

I am sceptical of the current trend of googling to find the call interface documentation and starting to use it without knowing anything about the architecture below. If I complain, nine out of ten times someone suggests: "But it is open software - you can download it and see how it works!" ... No, the implementation is NOT the architecture! When you ask for an architectural drawing and are given a house, and told "You can make your own drawing of this house, can't you?", then you are wasting my time. You rarely find software "architectural drawings" by googling - documentation that is independent of the coding/implementation. I see that as a big problem.

Even more, I am outright scared by how large a fraction of young software developers, those educated after Google, appear to think it is perfectly fine. If it works, there is nothing to worry about. If not, you google for a quick fix. Ask them why that fix cured the problem, and they shrug: "Don't know, but it works now. Good enough for me!" That is not a good approach for writing robust software. And lots of software written today is not robust.
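To illustrate the distinction with a tiny, made-up C# example (the names are hypothetical, not from any real library): the interface below is the "architectural drawing" a caller needs to understand; the class is merely one "house" built from it, and it can be replaced without the callers ever noticing.

using System.Collections.Generic;

// The "architectural drawing": what the layer below promises to do.
public interface IMessageStore
{
    void Append(string message);
    IReadOnlyList<string> ReadAll();
}

// One possible "house": an in-memory implementation. It could be swapped
// for a file- or database-backed version (or a thin wrapper around native
// code) without any change to the code written against IMessageStore.
public sealed class InMemoryMessageStore : IMessageStore
{
    private readonly List<string> _messages = new List<string>();

    public void Append(string message) => _messages.Add(message);

    public IReadOnlyList<string> ReadAll() => _messages.AsReadOnly();
}

Knowing the contract and its data structures is what survives a re-implementation; knowing the internals of one particular implementation is not.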
-
If that option is available, it might be a very good option. Lots of people live too far away from a college. Even if there is a local college, the course may be taught at times when you cannot leave your ordinary work. Or admission to the college requires that certain formalities are in place, e.g. that the course is available only to full-time students. Finally (this might be a bigger problem in Europe than in the US): some colleges/universities fiercely cling to the idea that Windows, or anything else coming from MS, is toy software - Real, Professional Software is Linux based (and with a command line interface, not a GUI). All basic courses are based on Linux and open-source software; Windows software/tools are introduced only as one of several options in courses specializing in end-user application development. Lots of newly educated bachelors and masters spend years of frustration when entering working life, realizing how much toy software is out there, and how difficult it is to enlighten people about the blessings of Linux and command line interfaces. If your local college is of that sort, you can go there to learn Linux and C (and possibly Python), but you may search in vain for C#, VS and .NET related courses.
Thanks for the thoughts. Actually the guy is already working as a government employee, a Police Officer, but he is interested in learning software development and wants to pursue a career in it. So I was thinking of pointing him towards .NET technologies; I would suggest he take this course online, as a self-paced option is provided by a few web sites.
-
I personally found "CLR via C#", 4th Edition, very helpful.
"Coming soon"
Yeah, that's a good one. I myself have read the first few chapters but haven't been able to finish the book yet :)
-
Why not go down to the basics and buy a box of transistors and a soldering iron, putting together your own machine? When my technical university started teaching computers around 1970, they actually had one computer (a NORD-1) delivered as components. The professors thought it a good idea that the students got hands-on experience in building a computer, even though the architecture (down to the printed circuit boards) was pre-defined. Soldering it together was still considered a valuable learning experience. (I am not quite sure about the technology at that time - I believe it was a mixture of discrete transistors etc. and small-scale integration chips, like the 74xxx series.) The oldest machine at my university, a GIER, had one side panel that was the control logic as a matrix of ferrite cores, directly accessible so you could "microcode" it by pulling the conductors through or outside each core, changing the effect of each instruction code. Something like that could be very useful for a novice who really wants to get to the roots of programming :-)
Well... it depends on whether someone has that much time. What I wanted was to quickly get him up to speed so he could start understanding C# and writing small programs in it; the time constraints apply to that guy, and to me too.
-
Don't mind if I do! Please pass the syrup.
Regards, Walt
CQ de W5ALT
Walt Fair, Jr., P. E.
Comport Computing
Specializing in Technical Engineering Software