The Thrill of 520kB of RAM
-
I cut my teeth on a 6502 processor: 8 bits and 65536 bytes of RAM to play with. Later I moved to 16 and finally 32 bits. As computers get more sophisticated, and accordingly more complicated, looking back I realize part of me enjoys them less. Don't get me wrong - garbage-collected code is great, and not having to worry about out-of-memory exceptions on a modern OS under most circumstances is a huge win that I think many of us take for granted. I like to bit-twiddle, though. Don't you? I liked it when I had to come up with something crafty to make it even work. It's a challenge, and it's more hacking than coding. Recently I had to change the timing in the driver code for a display I was using, because it had never been coded for my IoT CPU. Even with "higher level" network stuff like REST communication, that element of hacking your way through without anything to spare is still there - the other day I had to roll my own HTTP chunked transfer encoding mechanism for batch uploading JSON data from an IoT device, because I didn't have enough RAM to load the logged data into memory and generate a Content-Length header for the upload. All these little skills I picked up coding in the 80s for getting things done without much to work with basically atrophied after years of not only not using them but *avoiding* them. We're not supposed to hack unless we have to. We're not supposed to be "clever". But here we are, full circle, with these tiny little SoC gadgets where it's all necessary again. And I love it. :)
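For anyone who hasn't rolled their own: a chunked body is just each piece of data prefixed with its length in hex, so you can stream records out one at a time and never need a Content-Length. A minimal sketch in C - the helper name and buffer handling here are mine for illustration, not the actual device code:

```c
#include <stdio.h>
#include <string.h>

/* Encode one HTTP chunk into out: "<hex length>\r\n<data>\r\n".
   Returns the number of bytes written, or -1 if out is too small.
   After the last record, the sender emits a final "0\r\n\r\n"
   terminator chunk to end the transfer. */
static int encode_chunk(const char *data, size_t len, char *out, size_t cap)
{
    int n = snprintf(out, cap, "%zx\r\n", len);     /* hex size line */
    if (n < 0 || (size_t)n + len + 2 >= cap)
        return -1;                                  /* won't fit */
    memcpy(out + n, data, len);                     /* chunk payload */
    out[n + len]     = '\r';
    out[n + len + 1] = '\n';
    out[n + len + 2] = '\0';
    return n + (int)len + 2;
}
```

Each logged JSON record gets pushed through this and written to the socket as it's read from storage, so only one record ever has to fit in RAM at a time.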
Real programmers use butterflies
64K of RAM. You were spoiled. I remember using MC6805 processors that had just under 2K of EEPROM and 112 bytes of RAM, including the stack. Did some amazing things with that, including a point-of-sale dual printer driver that controlled everything: firing the print head, fonts, four different kinds of print hardware, a bit-banged serial port. Everything was interrupt-driven state machines, to reduce RAM requirements and provide multitasking. I even had to use some interrupt vectors for code space. It was glorious fun.
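The trick is that each device's entire context is one state byte, and every interrupt just advances that byte's machine one step, so there's no task stack per device at all. A toy sketch of the idea - the states and handler name are invented for illustration, nothing like the real printer firmware:

```c
/* One byte of RAM holds the whole "task": its current state.
   Each hardware interrupt advances the machine one step and returns,
   so many devices can be multitasked with no per-task stacks. */

enum printer_state { IDLE, FEEDING, PRINTING };     /* fits in one byte */

static unsigned char state = IDLE;

/* Called from the interrupt handler for this device. */
static void printer_isr(void)
{
    switch (state) {
    case IDLE:     state = FEEDING;  break;  /* start-job interrupt     */
    case FEEDING:  state = PRINTING; break;  /* paper-in-position event */
    case PRINTING: state = IDLE;     break;  /* line-done event         */
    }
}
```

With one such machine per peripheral, the ISRs interleave naturally and total RAM cost is a handful of state bytes plus the single shared stack.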
"Time flies like an arrow. Fruit flies like a banana."
-
Well, when you start to add the libraries for Bluetooth, WiFi, the touch screen, the SD card, the FAT32 filesystem for the SD, etc., that 520kB of RAM isn't much - but 112 bytes is insane. :laugh:
Real programmers use butterflies
-
Okay so I worded that badly! :laugh:
Real programmers use butterflies
-
I remember doing embedded applications and we only had 4K of RAM. That was a real thrill.
So many years of programming I have forgotten more languages than I know.
4K? Luxury!!