Assembly Step-by-Step: First read this book in 1993
-
I didn't read that book. But I did learn the subject, because assembly was still part of the curriculum in college for me in 2000-2002. That's what helped me get good really fast at LAD (ladder logic) in PLC programming. Not the same, but similar enough.
M.D.V. ;) If something has a solution... Why do we have to worry about?. If it has no solution... For what reason do we have to worry about? Help me to understand what I'm saying, and I'll explain it better to you Rating helpful answers is nice, but saying thanks can be even nicer.
When I was teaching various aspects of programming and data communication at a tech college in the early 1990s, we saw it as essential that the programming students had at least some understanding of what happened to their programs after compilation. So I taught a course in elementary computer architecture: the ideas of ALU functions, registers, buses, instruction and addressing formats ... No implementation technology, only those aspects relevant to the software developer. In this course, the homework assignments were x86 assembly coding. (I lost the battle to get some 68K machines for the college, which would have been a great advantage in teaching a clean, non-messy architecture.) We certainly did not intend to teach the students assembly as a viable development tool; its primary purpose was to give them a 'hands-on' feel for the implications and limitations of a processor architecture. I still believe that this is The Essential Aspect of learning assembler coding: to understand what a CPU is really like. You can take advantage of that understanding when writing high-level code.
-
Jeff Duntemann just released the 4th edition of his fantastic assembly language book, and I've started reading it: x64 Assembly Language Step-By-Step: Programming With Linux. I read the first edition way back in '93. It was sitting on a shelf at work and no one had read it. The book has quite a lot of history in it, and it's a really great read. The author was the first who helped me understand math in different number bases; he does a great job of explaining things simply. Have any of you read any version of the book? It's interesting to follow a book for over 30 years. Wow!
I had the second edition but lost that in a fire. I bought the third edition and still have that, and tomorrow I will have the 4th edition! I've followed Jeff since his PC Tech Journal (Magazine) and Delphi (Pascal) days.
-
Jeff Duntemann just released the 4th edition of his fantastic assembly language book, and I've started reading it: x64 Assembly Language Step-By-Step: Programming With Linux. I read the first edition way back in '93. It was sitting on a shelf at work and no one had read it. The book has quite a lot of history in it, and it's a really great read. The author was the first who helped me understand math in different number bases; he does a great job of explaining things simply. Have any of you read any version of the book? It's interesting to follow a book for over 30 years. Wow!
I am curious: those of you buying/reading a book of this kind, why do you read it? 1) Because you need to understand the instruction set to create a compiler, interpreter, etc. for code written in a high-level language. 2) To write actual production code (including drivers and such) in assembly language. 3) Just because you are curious about the instruction set / architecture, without intending to produce any production code in assembly. I am definitely in the third category. I do read instruction set manuals. Thirteen years ago (that was the last time) I delivered a module that had to stay below 1200 bytes of code (it clocked in at 1103 bytes); the programming was done in C. I do suspect that a major fraction of those claiming to use assembler in their production code are talking about a handful of instructions written as inline assembly in C, or maybe instructions wrapped into library-provided intrinsics. A few developers still deliver modules written exclusively in assembly, but those are few and far between. I'd be surprised if there are enough of them to justify the publication of a book (and most of them wouldn't need that book anyway :-)). Yet, I welcome the books! Their authors should realize that their primary audience is not those who will create production code as assembly-written modules, but those needing to understand the nature of the animal they are trying to master through their high-level language.
-
An 18,000-line driver in those days? That sounds rather extreme for a driver. I mean, after Bill Gates had granted 640K of the 1M address space to application code, which should be enough for everybody, only 384K was left for the OS and drivers. Most instructions take up 2 or 3 bytes. Assuming that most of your 18,000 lines were instructions, your driver alone would fill something like 10% of the total system space in RAM! Admittedly, I never studied half an OS. The memory limits were probably less constrained than in DOS - but physical memory was still limited in those days, and for the most part, drivers need to stay resident. Sidetrack: it is a long time since I heard old professors emeriti referred to as 'TSRs'. Young people of today never learned that term. :-)
-
I am curious: those of you buying/reading a book of this kind, why do you read it? 1) Because you need to understand the instruction set to create a compiler, interpreter, etc. for code written in a high-level language. 2) To write actual production code (including drivers and such) in assembly language. 3) Just because you are curious about the instruction set / architecture, without intending to produce any production code in assembly. I am definitely in the third category. I do read instruction set manuals. Thirteen years ago (that was the last time) I delivered a module that had to stay below 1200 bytes of code (it clocked in at 1103 bytes); the programming was done in C. I do suspect that a major fraction of those claiming to use assembler in their production code are talking about a handful of instructions written as inline assembly in C, or maybe instructions wrapped into library-provided intrinsics. A few developers still deliver modules written exclusively in assembly, but those are few and far between. I'd be surprised if there are enough of them to justify the publication of a book (and most of them wouldn't need that book anyway :-)). Yet, I welcome the books! Their authors should realize that their primary audience is not those who will create production code as assembly-written modules, but those needing to understand the nature of the animal they are trying to master through their high-level language.
Great questions. I originally read the 1st edition because I was just learning programming, and it helped me understand how the "machine" operates at its most basic level. Now, all these years later, I've done quite a bit of Arduino programming. And not just "Arduino" programming, but attempts to build a complete product around a microcontroller platform. At one point I was building a room-temperature monitor which allowed the user to see the specific temperature in any room of the house via a phone app, which read and controlled the temperature monitor via Bluetooth. You could also turn on temperature watching, and it would write values to an SD card so you could discover whether a room was a cold / hot zone by examining the sampled data. By that time the code was so large I needed to use the ATmega4809 -- which is actually the main chip on the Arduino Nano Every. That chip has more program memory than the basic ATmega328 (on the basic Arduino). However, I was using the DIP version of the chip, because the Nano Every was like $10 and the chip was $4. I had to learn to use Microchip Studio and an ICE programmer to program the chip, and learn all kinds of things about it. But I still hadn't learned assembly on the chip, and I'd like to; there are huge gaping holes in my assembly understanding, so I was hoping learning some on Linux would help me learn it on other platforms. I also found it so fascinating that putting a voltage on a pin of an Arduino caused the chip to do some specific thing. I like that assembly makes me think in such a different way than high-level languages do, and that makes me think of new things to try.
-
On a given processor / hardware? Or because the designers decided that less than 640K would be enough for everybody? OS/2 certainly didn't predate large-machine virtual memory, but I am fairly convinced that it did predate widespread hardware support for virtual memory management on the x86 architecture. OS/2 was designed to run on pre-386 architectures, wasn't it? Correct me if I am wrong! Nevertheless, an 18,000-line driver for a PC was rather massive at that time!
-
Great questions. I originally read the 1st edition because I was just learning programming, and it helped me understand how the "machine" operates at its most basic level. Now, all these years later, I've done quite a bit of Arduino programming. And not just "Arduino" programming, but attempts to build a complete product around a microcontroller platform. At one point I was building a room-temperature monitor which allowed the user to see the specific temperature in any room of the house via a phone app, which read and controlled the temperature monitor via Bluetooth. You could also turn on temperature watching, and it would write values to an SD card so you could discover whether a room was a cold / hot zone by examining the sampled data. By that time the code was so large I needed to use the ATmega4809 -- which is actually the main chip on the Arduino Nano Every. That chip has more program memory than the basic ATmega328 (on the basic Arduino). However, I was using the DIP version of the chip, because the Nano Every was like $10 and the chip was $4. I had to learn to use Microchip Studio and an ICE programmer to program the chip, and learn all kinds of things about it. But I still hadn't learned assembly on the chip, and I'd like to; there are huge gaping holes in my assembly understanding, so I was hoping learning some on Linux would help me learn it on other platforms. I also found it so fascinating that putting a voltage on a pin of an Arduino caused the chip to do some specific thing. I like that assembly makes me think in such a different way than high-level languages do, and that makes me think of new things to try.
raddevus wrote:
I also found it so fascinating that putting a voltage on a pin of an Arduino caused the chip to do some specific thing.
Then don't get close to a PLC or you will flip out :rolleyes: :laugh:
M.D.V. ;) If something has a solution... Why do we have to worry about?. If it has no solution... For what reason do we have to worry about? Help me to understand what I'm saying, and I'll explain it better to you Rating helpful answers is nice, but saying thanks can be even nicer.
-
Jeff Duntemann just released the 4th edition of his fantastic assembly language book, and I've started reading it: x64 Assembly Language Step-By-Step: Programming With Linux. I read the first edition way back in '93. It was sitting on a shelf at work and no one had read it. The book has quite a lot of history in it, and it's a really great read. The author was the first who helped me understand math in different number bases; he does a great job of explaining things simply. Have any of you read any version of the book? It's interesting to follow a book for over 30 years. Wow!
Long back, I started reading this book but, as with most other books, did not complete it: [Art of Assembly Language Programming and HLA by Randall Hyde](https://www.randallhyde.com/AssemblyLanguage/www.artofasm.com/Windows/index.html)
-
On a given processor / hardware? Or because the designers decided that less than 640K would be enough for everybody? OS/2 certainly didn't predate large-machine virtual memory, but I am fairly convinced that it did predate widespread hardware support for virtual memory management on the x86 architecture. OS/2 was designed to run on pre-386 architectures, wasn't it? Correct me if I am wrong! Nevertheless, an 18,000-line driver for a PC was rather massive at that time!
trønderen wrote:
OS/2 certainly didn't predate large-machine virtual memory, but I am fairly convinced that it did predate widespread hardware support for virtual memory management on the x86 architecture. OS/2 was designed to run on pre-386 architectures, wasn't it? Correct me if I am wrong!
OS/2 1.x could run on an 80286, and could run a single instance of DOS programs (non-multitasked). It had segment-level virtual memory (swapping out entire segments at a time). OS/2 2.x and later ran on 80386 and above, used the page-level virtual memory system, and could run multiple instances of DOS (multitasked).
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows. -- 6079 Smith W.
-
When I was teaching various aspects of programming and data communication at a tech college in the early 1990s, we saw it as essential that the programming students had at least some understanding of what happened to their programs after compilation. So I taught a course in elementary computer architecture: the ideas of ALU functions, registers, buses, instruction and addressing formats ... No implementation technology, only those aspects relevant to the software developer. In this course, the homework assignments were x86 assembly coding. (I lost the battle to get some 68K machines for the college, which would have been a great advantage in teaching a clean, non-messy architecture.) We certainly did not intend to teach the students assembly as a viable development tool; its primary purpose was to give them a 'hands-on' feel for the implications and limitations of a processor architecture. I still believe that this is The Essential Aspect of learning assembler coding: to understand what a CPU is really like. You can take advantage of that understanding when writing high-level code.
We used a 68008 to program a self-built computer: CPU, memory, EPROM (UV-erasable), RS-232 serial port. Circa 1988. I remember one lab where they forced us to use three levels of subroutines, where each level used a different parameter-passing approach: pass by value, then by pointer, then by pointer to pointer. From version 1 to version 20, which finally worked, I might have had one opcode different. It really makes you understand and appreciate how the higher-level languages work. For example, if you do not understand pointers, then there is no way you understand Java object "references". The 68000 LEA opcode is stuck in my brain forever!
-
Jeff Duntemann just released the 4th edition of his fantastic assembly language book, and I've started reading it: x64 Assembly Language Step-By-Step: Programming With Linux. I read the first edition way back in '93. It was sitting on a shelf at work and no one had read it. The book has quite a lot of history in it, and it's a really great read. The author was the first who helped me understand math in different number bases; he does a great job of explaining things simply. Have any of you read any version of the book? It's interesting to follow a book for over 30 years. Wow!
I don't remember the first assembly title I read, but I remember the analogy it used for memory: there is a very long street with all of the mailboxes on one side. Each mailbox has an address and holds some information/data or an instruction/task. You start by opening mailbox 0/1, which will ALWAYS contain an instruction. …
-
An 18,000-line driver in those days? That sounds rather extreme for a driver. I mean, after Bill Gates had granted 640K of the 1M address space to application code, which should be enough for everybody, only 384K was left for the OS and drivers. Most instructions take up 2 or 3 bytes. Assuming that most of your 18,000 lines were instructions, your driver alone would fill something like 10% of the total system space in RAM! Admittedly, I never studied half an OS. The memory limits were probably less constrained than in DOS - but physical memory was still limited in those days, and for the most part, drivers need to stay resident. Sidetrack: it is a long time since I heard old professors emeriti referred to as 'TSRs'. Young people of today never learned that term. :-)
OS/2 Warp 3.0 and later supported a large amount of RAM in a linear, 32-bit address space. Our machines at the time had 16 to 64 MB. The driver was large because the hardware was complex and needed to support many operations in near real-time. As a result, a lot of what you might normally think of as application functionality was implemented in the driver.
Software Zen:
delete this;