Code Project · The Lounge
The Thrill of 520kB of RAM

Tags: json, css, sysadmin, iot, performance
25 Posts · 10 Posters
#5 · honey the codewitch (in reply to RickZeeland)

RickZeeland wrote (quoting "these little tiny SoC gadgets"):

Ah dangerous stuff: Special Operations Command! :-\

I think that would be SOC technically. :)

Real programmers use butterflies
#6 · Ron Anders (in reply to honey the codewitch)

honey the codewitch wrote:

It's not so much for patting myself on the back, although I do like the sense of accomplishment. Really, I enjoy the challenge.

Beats the #$%^& out of crocheting. :thumbsup:
#7 · honey the codewitch (in reply to Ron Anders)

Ron Anders wrote:

Beats the #$%^& out of crocheting. :thumbsup:

Plus I can't crochet. That is sorcery. I'm more about the witchcraft.

Real programmers use butterflies
#8 · Mike Hankey (in reply to honey the codewitch)

honey the codewitch wrote:

I cut my teeth on a 6502 processor: 8 bits and 65,536 bytes of RAM to play with. Later on I moved to 16 and finally 32 bits. As computers get more sophisticated, and accordingly more complicated, looking back I realize part of me enjoys them less. Don't get me wrong: garbage-collected code is great, and not having to worry about out-of-memory exceptions on a modern OS under most circumstances is a huge win that I think many of us take for granted. But I like to bit-twiddle. Don't you? I liked it when I had to come up with something crafty to make it even work. It's a challenge, and it's more hacking than coding.

Recently I had to change the timing in the driver code for a display I was using, because it had never been coded for my IoT CPU. Even with "higher level" network stuff like REST communication, that element of hacking your way through without anything to spare is still there. The other day I had to roll my own HTTP chunked transfer encoding mechanism for batch-uploading JSON data from an IoT device, because I didn't have enough RAM to load the logged data into memory to generate a Content-Length header for the upload.

All these little skills I picked up coding in the 80s for getting things done without much to work with basically atrophied after years of not only not using them but *avoiding* them. We're not supposed to hack unless we have to. We're not supposed to be "clever". But here we are, full circle, with these little tiny SoC gadgets where it's all necessary again. And I love it. :)

Real programmers use butterflies

Yeah, it's a lot of fun playing with retro devices. I recently built a Z80 board and loaded CP/M on it. What a rush. I haven't had time to play with it or any of my other devices lately; I've been working four months solid remodeling a house, but I'm nearly done, so I'm looking forward to getting back into it.

I'm not sure how many cookies it makes to be happy, but so far it's not 27. JaxCoder.com
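The chunked-upload trick from the opening post, streaming logged JSON without ever knowing the total body size up front, can be sketched roughly like this. The chunk framing follows HTTP/1.1 (hex length, CRLF, payload, CRLF, then a zero-length terminating chunk); the sample records are made up for illustration, and a real device would read them one at a time from flash rather than from a list:

```python
def chunk(data: bytes) -> bytes:
    """Frame one piece of data as an HTTP/1.1 chunk:
    hex length, CRLF, payload, CRLF."""
    return b"%x\r\n%s\r\n" % (len(data), data)

def chunked_body(records):
    """Stream an iterable of byte records as a chunked body, so no
    Content-Length header (and no full in-memory copy) is ever needed."""
    for rec in records:
        if rec:                      # a zero-length chunk would end the body early
            yield chunk(rec)
    yield b"0\r\n\r\n"               # terminating chunk + empty trailer

# Hypothetical log records standing in for data read incrementally from storage:
records = [b'{"t":1}', b'{"t":2}']
body = b"".join(chunked_body(records))
print(body)  # b'7\r\n{"t":1}\r\n7\r\n{"t":2}\r\n0\r\n\r\n'
```

Each chunk can be written to the socket as soon as it is produced, which is the whole point on a device that cannot buffer the full payload.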
#9 · honey the codewitch (in reply to Mike Hankey)

I have a friend I grew up coding with who is a fan of retrocomputing, and I've been trying to pique his interest in IoT gadgets, in part because it uses many of the same skills and you can build useful things with it. Plus there's even money in it.

Real programmers use butterflies
#10 · Gary R Wheeler (in reply to honey the codewitch)

One of the best pieces of coding fun I've had in the last ten years was a project using a Microchip Technology PIC microcontroller: 8 bits, and IIRC 256 bytes of RAM and 2KB of ROM. The code fit on a single printed page. Timing requirements were such that the comments for each line of code included the number of clock cycles required for the instruction.

Software Zen: delete this;
#11 · honey the codewitch (in reply to Gary R Wheeler)

Gary R. Wheeler wrote:

Timing requirements were such that the comments for each line of code included the number of clock cycles required for the instruction.

That reminds me of graphics coding on the original Nintendo. :-D

Real programmers use butterflies
#12 · Mike Hankey (in reply to honey the codewitch)

I got a friend of mine interested in electronics and embedded devices, and for about three years he feverishly learned and built, but then he got into ham radio and doesn't do much with it anymore.

I'm not sure how many cookies it makes to be happy, but so far it's not 27. JaxCoder.com
#13 · Gary R Wheeler (in reply to honey the codewitch)

honey the codewitch wrote:

That reminds me of graphics coding on the original Nintendo

I had another project back in the late 1980s. We found an algorithm for converting a measured CMYK gamut to RGB via interpolation. The gamut was used to perform fine-grained correction between a scanned original image and the 35mm film we were imaging on. I was responsible for implementing the essential algorithm in software. I knew the customer eventually wanted a hardware-accelerated version, as the software version took minutes to process an image. I spent several days refactoring the code to emulate a potential hardware implementation. When I was done, the image processing time was under a second, and the hardware 'implementation' only required a few hundred KB of RAM for coefficient tables computed from the 'raw' gamut data. I believe an actual hardware implementation would have been usable in real time. Unfortunately the project ran out of money before we got that far. :((

Software Zen: delete this;
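For contrast with the measured-gamut interpolation Gary describes, the textbook device-independent CMYK-to-RGB formula is a one-liner per channel. This is only the idealized conversion; the whole point of the project above was that real ink on real media does not behave this way, which is why a lookup table computed from measured gamut data was needed instead:

```python
def cmyk_to_rgb(c, m, y, k):
    """Naive device-independent CMYK -> RGB (all inputs in 0..1).

    A press workflow like the one described above would instead
    interpolate measured gamut data; this ideal formula ignores
    how inks actually mix on film or paper.
    """
    r = round(255 * (1 - c) * (1 - k))
    g = round(255 * (1 - m) * (1 - k))
    b = round(255 * (1 - y) * (1 - k))
    return r, g, b

print(cmyk_to_rgb(0, 0, 0, 0))  # no ink -> (255, 255, 255), white
print(cmyk_to_rgb(0, 0, 0, 1))  # full black ink -> (0, 0, 0)
```

The gap between this formula and measured output is exactly what the coefficient tables in the hardware emulation had to correct for.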
#14 · Daniel Pfeffer (in reply to honey the codewitch)

honey the codewitch wrote:

All these little skills I picked up coding in the 80s for getting things done without much to work with

I used to enjoy the bit-twiddling and extreme optimization, too. I still do, when coding for my own enjoyment. OTOH, we are professional programmers. It is our duty to solve any problem in the quickest and cheapest way that meets the requirements. Given two options:

1. Use a low-powered system. Program it in C with all possible performance-enhancing tricks, with the risks associated with no memory management, etc. Completion expected in one year, with a high risk of delays.
2. Use a higher-powered system. Program it in C++, C#, or Java, with powerful libraries and proper memory management. Completion expected in six months, with a medium risk of delays.

I would say that the equation of "time to market + risk" vs. "development and production platforms" is not one that can be decided by us. In some cases (e.g. small runs or one-offs), the client will opt to throw hardware at the problem and get a faster (and possibly cheaper in the long run) solution. In others (e.g. high-volume production), the cost of the production hardware trumps everything. In still other cases (e.g. a demo for an expo), meeting the schedule is all-important.

Freedom is the freedom to say that two plus two make four. If that is granted, all else follows. -- 6079 Smith W.
#15 · honey the codewitch (in reply to Daniel Pfeffer)

I agree with you, and it's normally true that hardware is cheaper than software. The situation is more complicated for IoT devices, though. They cannot scale out, and sometimes you need their size and power requirements. I am building something for a client on just such a platform now. Bit-twiddling is necessary; they're paying for the privilege of running on one of these little guys. That's how I look at it. I love doing it too. It means what I used to only be able to do for fun, I now get to do professionally. :)

Real programmers use butterflies
#16 · honey the codewitch (in reply to Gary R Wheeler)

So many good ideas left on the cutting room floor because of ruthless beancounters. :((

Real programmers use butterflies
#17 · honey the codewitch (in reply to Mike Hankey)

Ham radio is a black hole for hobbyists!! :laugh:

Real programmers use butterflies
#18 · michaelbarb (in reply to honey the codewitch)

I remember doing embedded applications where we only had 4K of RAM. That was a real thrill.

So many years of programming; I have forgotten more languages than I know.
#19 · honey the codewitch (in reply to michaelbarb)

4kB is not a lot! That sounds like fun. 520kB seems like an incredible amount in comparison, until you add the libraries for Bluetooth, Wi-Fi, the touchscreen, the SD read/write, the HTTP REST client with JSON, and the FAT32 filesystem to go with the SD; pretty soon it's not much at all! :laugh:

Real programmers use butterflies
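The arithmetic behind "520kB isn't much once the libraries move in" can be made concrete. The footprints below are purely illustrative guesses, not measurements of any particular SDK or chip; the point is only how quickly plausible static allocations eat the budget:

```python
# Hypothetical static RAM footprints (kB) for the kinds of libraries
# mentioned above. Illustrative numbers only, not real measurements.
budget_kb = 520
libraries = {
    "bluetooth stack": 64,
    "wifi stack + TCP/IP": 128,
    "touchscreen driver": 8,
    "SD card driver": 16,
    "FAT32 filesystem": 32,
    "HTTP client + JSON": 48,
}

used = sum(libraries.values())
print(f"library overhead: {used} kB")
print(f"left for the app: {budget_kb - used} kB")
```

With these guesses more than half the RAM is gone before the application allocates its first buffer, which is roughly the squeeze being described.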
#20 · Mike Hankey (in reply to honey the codewitch)

I have a special affection for ham operators, especially the MARS variety. When I was across the big pond many, many years ago, they provided a way for us to actually call home. It was convoluted and iffy, but if everything was just so, we could get three minutes to talk to a loved one. When you're in a foreign country thousands of miles from home, 13 months at a time, with only mail (and that not reliable), it was a blessing.

I'm not sure how many cookies it makes to be happy, but so far it's not 27. JaxCoder.com
#21 · Matthew Dennis (in reply to honey the codewitch)

64K of RAM? You were spoiled. I remember using MC6805 processors that had just under 2K of EEPROM and 112 bytes of RAM, including the stack. We did some amazing things with that, including a point-of-sale dual printer driver that controlled everything: firing the print head, fonts, four different kinds of print hardware, a bit-banged serial port. Everything was interrupt-driven state machines, to reduce RAM requirements and provide multitasking. I even had to use some interrupt vectors for code space. It was glorious fun.

"Time flies like an arrow. Fruit flies like a banana."
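The state-machine pattern Matthew describes is RAM-cheap because each task's entire context is one state variable; the "events" are interrupts, and no task needs its own stack. A minimal sketch of the idea in Python (the real thing would be 6805 assembly, and this hypothetical byte-receiver task is invented for illustration):

```python
# Each cooperatively-multitasked "task" keeps only a state number between
# events, which is why many of them fit in a few dozen bytes of RAM.
IDLE, RECEIVING, DONE = range(3)

def step(state, event, buf):
    """Advance the receiver state machine by one event; return the new state."""
    if state == IDLE and event == "start":
        buf.clear()
        return RECEIVING
    if state == RECEIVING:
        if event == "stop":
            return DONE
        buf.append(event)   # event is a data byte
        return RECEIVING
    return state            # ignore events that don't apply

buf = []
state = IDLE
for ev in ["start", 0x48, 0x69, "stop"]:  # pretend interrupt sequence
    state = step(state, ev, buf)

print(state, bytes(buf))  # -> 2 b'Hi'
```

On the MC6805 the dispatch would be a jump table indexed by the state byte, but the shape of the design is the same.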
#22 · honey the codewitch (in reply to Matthew Dennis)

Well, when you start to add the libraries for Bluetooth, Wi-Fi, the touchscreen, the SD, the FAT32 filesystem for the SD, etc., that 520kB of RAM isn't much. But 112 bytes is insane. :laugh:

Real programmers use butterflies
#23 · DerekT P (in reply to honey the codewitch)

honey the codewitch wrote:

finally 32 bit

Wow, have we got news for you! :laugh:
#24 · honey the codewitch (in reply to DerekT P)

Okay so I worded that badly! :laugh:

Real programmers use butterflies