Did anyone here ever race the electron beam? And did you win the race?

The Lounge · Tags: graphics, debugging, tools, performance, question · 26 posts, 16 posters

• CodeWraith wrote:

    Racing the beam is a way to put graphics on a screen without having any graphics memory. It was used in the 1970s, when memory chips still had tiny capacities and cost their weight in gold or more. The Atari VCS is a well-known console that used it. It had only 128 bytes of RAM, and the programs were on ROMs in the cartridges, so there was no room at all for a video buffer. It is called 'racing the beam' because the processor spends most of its time staying ahead of the electron beam of the CRT monitor, putting the graphics data that will be displayed next directly into the registers of the graphics chip just in time. Be too quick or too slow and you have only garbage on the screen. And such luxuries as actual gameplay had to wait until the graphics chip was done with the current frame and entered the vertical blank period before starting with the next frame. Horribly fragile code and a nightmare to debug. Even proper debugging tools as we know them did not exist yet. But programmers who can deal with such old stuff are afraid of nothing.

    I have lived with several Zen masters - all of them were cats. His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
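
(To make the timing concrete, here is a minimal sketch in C of the per-scanline "kernel" described above. The memory-mapped flags and registers are hypothetical stand-ins; on the real VCS this was hand-counted 6502 assembly driving the TIA, not C.)

```c
/* Illustrative "racing the beam" kernel. All addresses are hypothetical. */
#include <stdint.h>

#define HSYNC_FLAG (*(volatile uint8_t *)0xD000) /* hypothetical: set during horizontal blank */
#define VSYNC_FLAG (*(volatile uint8_t *)0xD001) /* hypothetical: set during vertical blank */
#define PIXEL_REG  (*(volatile uint8_t *)0xD010) /* hypothetical: next byte of pixels to shift out */

void draw_frame(const uint8_t scanlines[192])
{
    for (int line = 0; line < 192; line++) {
        while (!HSYNC_FLAG)            /* race: be ready before the beam starts the line */
            ;
        PIXEL_REG = scanlines[line];   /* write just in time; too early or too late = garbage */
    }
    while (!VSYNC_FLAG)                /* vertical blank: the only time left over */
        ;
    /* game logic has to fit here, before the next frame starts */
}
```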

User 11907673 replied (#16):

Wow, this really took me back 40+ years. While I never "raced the beam", I did plenty of other coding around interrupts (actual REAL interrupts, not the software-abstracted ones of today) and syncing them up with the code.

Juan Pablo Reyes Altamirano replied (#17), quoting CodeWraith's post above:

Sounds like fun times. I never got that far back: I coded 6502 asm, but Nintendo's hardware (the NES) already had a pretty slick PPU that made timing somewhat easy. I also coded ARM32 asm for the DSi, but that was mostly to do basic geometric transforms in its weird video memory (bitmaps were not stored in a straightforward manner).
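
(A sketch of what "not straightforward" usually means: many retro video memories store bitmaps as 8x8 tiles rather than linear scanlines, so plotting a pixel means computing a tiled address first. This is a generic tiled layout for illustration, not the DSi's actual format.)

```c
/* Convert (x, y) in a width_px-wide 8bpp image to a byte offset in
 * tile-ordered video memory (8x8 tiles, 64 bytes per tile). */
#include <stdint.h>

static uint32_t tiled_offset(uint32_t x, uint32_t y, uint32_t width_px)
{
    uint32_t tiles_per_row = width_px / 8;
    uint32_t tile  = (y / 8) * tiles_per_row + (x / 8); /* which 8x8 tile */
    uint32_t inner = (y % 8) * 8 + (x % 8);             /* offset inside the tile */
    return tile * 64 + inner;
}
```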

Matthew Barnett replied (#18), quoting CodeWraith's post above:

I'm surprised no-one's mentioned the ZX80 and ZX81. They didn't have specialised circuitry to handle the display; instead, they had the Z80 execute the contents of the screen memory, with the hardware ensuring that the Z80 itself saw only NOPs until the end of the line. The contents of the data bus (the actual character codes) were then fed to the character generator.
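
(A toy model of that trick: the ULA snoops instruction fetches from the display file, forcing a NOP onto the CPU's data bus for displayable characters while latching the character code for the character generator. The function below is a simplified simulation in C, not ZX81 firmware; the real hardware also involves the refresh address and the HALT opcode at line end.)

```c
/* Simplified ZX81-style display fetch: displayable characters have bit 6
 * clear, so the "ULA" substitutes NOP and keeps the code for the char ROM. */
#include <stdint.h>

static uint8_t fetch_with_ula(uint8_t display_byte, uint8_t *char_latch)
{
    if ((display_byte & 0x40) == 0) {   /* displayable character */
        *char_latch = display_byte;     /* goes to the character generator */
        return 0x00;                    /* CPU sees NOP and marches on */
    }
    return display_byte;                /* e.g. HALT at end of line: CPU executes it */
}
```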

Peter Shaw replied (#19), quoting CodeWraith's post above:

I wouldn't necessarily say "racing the beam", but I had plenty of fun on the C64 trying to program ever more complex "copper effects", and on the BBC trying to cram as much 6845 CRTC hardware trickery as I could inside the VBI fly-back event. I did some interesting experiments when I first bought this book: [Graphics Programming Black Book Special Edition (with CD-ROM): Amazon.co.uk: Abrash, Michael: 9781576101742: Books](https://www.amazon.co.uk/Graphics-Programming-Black-Special-CD-ROM/dp/1576101746). Lots of juicy DOS/PC-based hardware graphics trickery, much of it timing-sensitive down to the frame and the scanline. FUN TIMES!!!
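
(For anyone who never saw one: a "copper effect" changes video registers mid-frame, keyed to the raster position. Below is a sketch in C of a raster-interrupt border-color split. The register addresses are the C64 VIC-II's, but real handlers were cycle-counted assembly, and real code also deals with the 9th raster bit and other IRQ sources; treat this as illustration only.)

```c
/* Raster-interrupt color split, C64-flavored sketch. Setup code (not shown)
 * points the IRQ vector here and sets the first raster compare line. */
#include <stdint.h>

#define VIC_RASTER  (*(volatile uint8_t *)0xD012) /* raster compare line */
#define VIC_IRQ_ACK (*(volatile uint8_t *)0xD019) /* interrupt latch */
#define VIC_BORDER  (*(volatile uint8_t *)0xD020) /* border color register */

void raster_irq(void)
{
    static uint8_t lower_half = 0;
    VIC_IRQ_ACK = 0x01;          /* acknowledge the raster interrupt */
    if (!lower_half) {
        VIC_BORDER = 2;          /* red below the split line */
        VIC_RASTER = 50;         /* fire again near the top of the frame */
    } else {
        VIC_BORDER = 14;         /* light blue above the split */
        VIC_RASTER = 150;        /* fire again at the split line */
    }
    lower_half = !lower_half;
}
```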

CodeWraith replied (#20), quoting Peter Shaw's post above:

I have that book on the shelf myself. A shame that all that pedal-to-the-metal assembly code has since been replaced by hard-wired logic in some graphics processor. But things have been heading that way all along. It turns out that even my first computer kind of raced the electron beam, but it was automated via interrupts and DMA, so all I had to learn was how to set up an interrupt routine; I no longer had to worry about how to put pixels on the screen. The computer's design goes back to 1976. At that time, having graphics at all was a complicated and expensive affair, so little single-board computer kits usually did not have any. Making graphics not only affordable but also relatively uncomplicated to use was a small wonder.

CodeWraith replied (#21), quoting Juan Pablo Reyes Altamirano's post above:

The memory addressing logic was designed to let whatever graphics hardware you had access its video buffer quickly, not for the programmer's convenience. The hardware was racing the electron beam for you, so there was little time to waste. And then there is also the old problem of how to synchronize CPU and graphics hardware access to the same memory.
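
(One classic answer to that synchronization problem, sketched in C with hypothetical registers: confine CPU writes to the blanking intervals, when the display hardware is not fetching from the buffer.)

```c
/* Shared-VRAM arbitration sketch. The status flag and buffer address are
 * hypothetical, not any specific machine's layout. */
#include <stdint.h>

#define VBLANK_FLAG (*(volatile uint8_t *)0xD002) /* hypothetical: nonzero during vertical blank */
#define VRAM ((volatile uint8_t *)0x4000)         /* hypothetical shared video buffer */

void safe_vram_write(uint16_t offset, uint8_t value)
{
    while (!VBLANK_FLAG)    /* wait until the video hardware is idle */
        ;
    VRAM[offset] = value;   /* no bus contention during blanking */
}
```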

CodeWraith replied (#22), quoting User 11907673's post above:

Interrupts - what a luxury. On my old box I do bit-banged serial communication without a UART. Currently all is well at 19200 baud. 38400 works for single bytes, but not for larger memory blocks; the timing error in the delay loops obviously accumulates too much when too many bytes are sent at once. Maybe I can resynchronize at every start bit, but I would have to overclock the old 8-bit processor a little more. It could go faster than 8 MHz if I raised the processor's core voltage above 5V. Perhaps it would then even become noticeably warm and actually require cooling.
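
(A sketch of the idea in C, with a hypothetical pin variable and a made-up delay constant; the real thing would be hand-timed assembly. It shows why drift accumulates within a frame and why the start bit is the natural resynchronization point.)

```c
/* Bit-banged serial transmit, illustration only. tx_pin and BIT_DELAY are
 * hypothetical placeholders for a real output port and a calibrated loop. */
#include <stdint.h>

static volatile uint8_t tx_pin = 1;   /* stand-in for a real output port bit */
#define TX_HIGH() (tx_pin = 1)
#define TX_LOW()  (tx_pin = 0)
#define BIT_DELAY 100                 /* calibration constant: ~52 us per bit at 19200 baud */

static void delay_one_bit(void)
{
    for (volatile int i = 0; i < BIT_DELAY; i++)
        ;                             /* any error here repeats once per bit... */
}

void send_byte(uint8_t b)
{
    TX_LOW();                         /* start bit: where a receiver can resynchronize */
    delay_one_bit();
    for (int i = 0; i < 8; i++) {     /* ...so it accumulates across the 8 data bits */
        if (b & 1) TX_HIGH(); else TX_LOW();   /* LSB first */
        b >>= 1;
        delay_one_bit();
    }
    TX_HIGH();                        /* stop bit */
    delay_one_bit();
}
```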

• jochance wrote:

IIRC, the NES one and some others will not work with modern TVs because they depended heavily on the CRT tech itself. We thought of it as "shooting the TV", but I might recall (or confuse with something else) reading that the gun was technically being shot by the beam from the TV, which then sent the angle of that back to the console, and it would deduce from that where the gun was aimed. There are newer ones (the Wii and PS4 had them for sure), but they are based on different tech. The PS4 had these big balls of light on the controllers and a camera that watched them. The Wii used some kind of IR system with a bar you put in front of the TV. I suppose games are still "racing the ray" in a sense; it's just that you don't get garbage on screen but rather a frozen screen or choppy framerate if frames aren't coming fast enough.

CodeWraith replied (#23):

Light guns (or light pens) were not very complicated. You could build one yourself with some cheap parts from Radio Shack. All you basically needed was a photocell, a button, and a toy gun to put them into. Many graphics chips simply had registers that reported the current position of the electron beam. When the sensor in the gun detected the electron beam, the gun had to be pointing at exactly those screen coordinates. No wild calculation of angles or anything like that. But this of course does not work when there is no electron beam to detect.
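
(A sketch of that register dance in C. The latch addresses are borrowed from the C64 VIC-II's light-pen latches, but the hit flag and the exact protocol are illustrative, not any specific chip's API.)

```c
/* Light-gun read sketch: when the photocell sees the beam, the beam's
 * latched coordinates ARE the aim point. GUN_HIT is hypothetical. */
#include <stdint.h>

#define BEAM_X  (*(volatile uint8_t *)0xD013) /* light-pen X latch (VIC-II-style) */
#define BEAM_Y  (*(volatile uint8_t *)0xD014) /* light-pen Y latch (VIC-II-style) */
#define GUN_HIT (*(volatile uint8_t *)0xD015) /* hypothetical: photocell saw the beam */

/* Returns 1 and fills (x, y) if the gun saw the beam this frame. */
int read_light_gun(uint8_t *x, uint8_t *y)
{
    if (!GUN_HIT)
        return 0;
    *x = BEAM_X;   /* the gun is aimed at exactly these coordinates */
    *y = BEAM_Y;
    return 1;
}
```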

Peter Shaw replied (#24), quoting CodeWraith's post #20 above:

I may have to build myself an MS-DOS (circa 1995) computer, just so I can do some nostalgic programming again. I do still have a set of MASM32 disks somewhere :-)

User 11907673 replied (#25), quoting Matthew Barnett's post above:

Z80! I haven't heard that name for almost 40 years; I used to write assembler for it, too. You know, this really means we are all a bunch of old farts!

User 11907673 replied (#26), quoting CodeWraith's post #22 above:

Impressive! I never went below the UART interrupt level. Your mention of 8 MHz reminded me of one of the things I tell the once-a-week Data Structures class I teach: when I first started working with microprocessors, clock rates were in kHz, not the GHz they are now.
