
Did anyone here ever race the electron beam? And did you win the race?

The Lounge · graphics, debugging, tools, performance, question
26 Posts · 16 Posters
CodeWraith wrote:

Racing the beam is a way to put graphics on a screen without having any graphics memory. It was used in the 1970s, when memory chips still had a tiny capacity and cost their weight in gold or more. The Atari VCS is a well-known console that used this. It had only 128 bytes of RAM and the programs were on ROMs in the cartridges, so there was no room at all for a video buffer. It is called 'racing the beam' because most of the time the processor is busy staying ahead of the electron beam of the CRT monitor, putting the graphics data that will be displayed next directly into the registers of the graphics chip just in time. Be too quick or too slow and you have only garbage on the screen. And such luxuries as actual gameplay had to wait until the graphics chip was done with the current frame and entered the vertical blank period before starting the next frame. Horribly fragile code and a nightmare to debug. Proper debugging tools as we know them did not exist yet. But programmers who can deal with such old stuff are afraid of nothing.

I have lived with several Zen masters - all of them were cats. His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
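To make the structure of such a kernel concrete, here is a minimal sketch in C-flavoured pseudocode. Real 2600 kernels are cycle-counted 6502 assembly, not C; the two TIA addresses are the real ones (writing anything to WSYNC halts the CPU until the next scanline starts), but everything else is illustrative:

```c
#include <stdint.h>

/* TIA registers (real 2600 addresses); writing any value to WSYNC
 * halts the CPU until the start of the next scanline. */
#define WSYNC  (*(volatile uint8_t *)0x02)
#define COLUBK (*(volatile uint8_t *)0x09)

/* One frame of a classic "rainbow" kernel: change the background color
 * on every one of the 192 visible scanlines, staying just ahead of the
 * beam. Miss the WSYNC window and the colors tear. */
void visible_frame(uint8_t frame_counter)
{
    uint8_t line;
    for (line = 0; line < 192; line++) {
        WSYNC  = 0;                             /* wait for horizontal blank     */
        COLUBK = (uint8_t)(line + frame_counter); /* poke next line's color early */
    }
}
```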

honey the codewitch wrote (#10), replying to CodeWraith:

LOL, I never knew it was called that, but I race the beam in IoT. Of course, now I have a coprocessor to help me. RGB interface displays use one wire for each color bit. Those wires source bits out of RAM while the display scans top to bottom, left to right. You have to spit the bits out at exactly the right time to get the display to show correctly. I use a little feature in the ESP32 called GDMA to make it possible; it takes the chasing away from the CPU. Basically you can connect one bit to one wire - up to 16 at a time - and point to a memory buffer, and the GDMA engine will read or write data to or from that buffer over those wires at a frequency you give it. Aside from that, I've emulated racing the beam several times while building old-school emulators, like Nintendo emulators. Fun times!

Check out my IoT graphics library here: https://honeythecodewitch/gfx
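For the curious, the setup honey describes looks roughly like this with ESP-IDF's esp_lcd RGB panel driver. This is a sketch from memory of the IDF v5.x API; the pin numbers, resolution, and timings are placeholders, so check everything against your own board and panel datasheet:

```c
#include <stdbool.h>
#include "esp_err.h"
#include "esp_lcd_panel_rgb.h"
#include "esp_lcd_panel_ops.h"

void rgb_panel_start(void)
{
    /* One GPIO per color bit (16 wires = RGB565) plus sync/clock pins.
     * The GDMA engine streams the framebuffer out over these wires at
     * pclk_hz, so the CPU never chases the scan itself. */
    esp_lcd_rgb_panel_config_t cfg = {
        .clk_src = LCD_CLK_SRC_DEFAULT,
        .data_width = 16,
        .timings = {
            .pclk_hz = 16 * 1000 * 1000,   /* placeholder pixel clock */
            .h_res = 800, .v_res = 480,    /* placeholder resolution  */
            .hsync_pulse_width = 4, .hsync_back_porch = 8, .hsync_front_porch = 8,
            .vsync_pulse_width = 4, .vsync_back_porch = 8, .vsync_front_porch = 8,
        },
        .hsync_gpio_num = 39, .vsync_gpio_num = 41,   /* placeholder pins */
        .de_gpio_num = 40, .pclk_gpio_num = 42, .disp_gpio_num = -1,
        .data_gpio_nums = { 8, 3, 46, 9, 1, 5, 6, 7,
                            15, 16, 4, 45, 48, 47, 21, 14 },
        .flags.fb_in_psram = true,         /* framebuffer lives in PSRAM */
    };
    esp_lcd_panel_handle_t panel;
    ESP_ERROR_CHECK(esp_lcd_new_rgb_panel(&cfg, &panel));
    ESP_ERROR_CHECK(esp_lcd_panel_reset(panel));
    ESP_ERROR_CHECK(esp_lcd_panel_init(panel));
}
```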

Daniel Pfeffer wrote:

Some things are better handled by hardware... :)

Freedom is the freedom to say that two plus two make four. If that is granted, all else follows. -- 6079 Smith W.

jmaida wrote (#11), replying to Daniel Pfeffer:

Memory lane. Tektronix used to have 4010 and 4014 graphical display terminals that used storage tubes as their display memory. They were like an electronic Etch A Sketch: clear the screen, then draw on it much like a line plotter. I used them a lot in grad school as terminals for a minicomputer (a rack-mounted Data General Eclipse). Computer graphics was becoming a core course for computer science and math majors at the time. Anyway, the point is that this was the early days of graphical displays that did not require lots of expensive dedicated raster memory (displays of that type were very expensive and used mostly by the CGI business, i.e. movies).

"A little time, a little trouble, your better day" Badfinger


DerekT P wrote (#12), replying to CodeWraith:

Not racing the beam as such, but spotting the beam, yes. In around 1981 I built myself a UK101 kit computer; 6502 CPU, (originally) 4k memory and RF output to a monochrome television of 32 (originally 16) rows and 64 columns. Complete with full logic diagram. I'd made various mods to the system, but decided it would be cool to be able to "draw" directly on the screen. This was in the days before mouse pointers, tablets, touch-screens etc. I knew the image on the CRT was a bright dot racing across the screen and figured that if I had a light-sensitive diode, I could trigger a signal in response to the dot passing under it. That signal was connected to an interrupt, and the interrupt processing code accessed what was effectively a hardware tick counter that was synchronised to the clock for the video driver. Based on the value of that tick counter I could tell where the electron beam would be, and therefore I could calculate a character row and column. As long as there were some pixels in the character, and with a little adjustment to the TV brightness controls, it could detect the position of the light-sensitive diode pretty accurately. Fit the diode in the end of a "wand" and, hey presto, I could draw lines on the screen. As you can see my grasp of it all was a little tenuous, but the excitement and joy when it actually worked was amazing... especially at the total cost of a few pennies and a couple of dozen lines of assembler code.

Telegraph marker posts ... nothing to do with IT · Phasmid email discussion group ... also nothing to do with IT · Beekeeping and honey site ... still nothing to do with IT
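The arithmetic in an interrupt handler like that is simple once the tick counter is locked to the video clock. A sketch, with made-up tick rates standing in for the real UK101 timings:

```c
/* Recovering the beam position from a tick counter synchronised to the
 * video clock. Timing constants are illustrative, not the real UK101 values. */
#include <stdint.h>

#define TICKS_PER_SCANLINE  64u  /* assumed: counter ticks per horizontal line  */
#define SCANLINES_PER_CHAR   8u  /* raster lines per character row              */
#define TICKS_PER_CHAR       1u  /* assumed: counter ticks per character column */

void beam_to_cell(uint32_t ticks_since_vsync, unsigned *row, unsigned *col)
{
    uint32_t scanline = ticks_since_vsync / TICKS_PER_SCANLINE;
    uint32_t h_tick   = ticks_since_vsync % TICKS_PER_SCANLINE;

    *row = scanline / SCANLINES_PER_CHAR;  /* character row (0..31)    */
    *col = h_tick   / TICKS_PER_CHAR;      /* character column (0..63) */
}
```

When the photodiode fires, the handler reads the counter, calls this, and the cell under the wand falls out directly.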


CodeWraith wrote (#13), replying to DerekT P:

I built my first computer in 1978 and still have it. So you essentially built a light gun. I wonder, does shooting the (nonexistent) beam still work on modern monitors? Do any game consoles even have light guns anymore? It seems like I saw the last ones some time in the last millennium.

Marc Clifton wrote:

Yes, in a variety of scenarios. On the Commodore 64, you could get an interrupt at a specific raster line, and one of the geniuses I worked with figured out that you could double the number of apparent sprites by interrupting halfway through the vertical rendering of the screen and switching the sprite bank pointers, then flipping back during vertical refresh. I also hand-coded, counting 80286 instruction cycles, the assembly code necessary to flip a video digitizing board from "read" to "write". See, we had this multispectral camera with a spinning disk of 6 bandpass optical filters in front of the CCD sensor, where the rotation of the disk with the glass filters was sync'd to the vertical refresh rate of the CCD (the flip side of racing the beam). So, every 1/60th of a second, you'd get an image through a different filter, which was something of a visual mess when looking at different spectrum slices. I figured out how to put the digitizer board into "read" mode for one field and "write" mode for the other 5, so you could get a stable real-time image of a specific filter. All of that had to be done during the vertical refresh period.

Latest Articles: A Lightweight Thread Safe In-Memory Keyed Generic Cache Collection Service · A Dynamic Where Implementation for Entity Framework

BBar2 wrote (#14), replying to Marc Clifton:

C64 raster interrupts were too much fun. You could change video modes, or increase the number of colors that could be displayed on the screen at one time, using the raster interrupt. Way too much fun.
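For anyone who never tried it, the sprite-doubling trick Marc describes looks roughly like this in cc65-flavoured C. The VIC-II register addresses are the real ones, but this is a one-sprite illustration; a production multiplexer juggles all eight sprites in cycle-counted assembly:

```c
#include <stdint.h>

#define VIC_RASTER   (*(volatile uint8_t *)0xD012) /* read: current line; write: compare line */
#define VIC_CTRL1    (*(volatile uint8_t *)0xD011) /* bit 7 = raster compare bit 8            */
#define VIC_IRQ_FLAG (*(volatile uint8_t *)0xD019) /* write 1 to acknowledge                  */
#define SPRITE0_Y    (*(volatile uint8_t *)0xD001)
#define SPRITE0_PTR  (*(volatile uint8_t *)0x07F8) /* pointer slot for default screen at $0400 */

static uint8_t bottom_half = 0;

void raster_irq(void)                /* called from the IRQ vector */
{
    VIC_IRQ_FLAG = 0x01;             /* acknowledge the raster interrupt   */
    VIC_CTRL1   &= 0x7F;             /* raster compare bit 8 = 0 (< 256)   */
    if (!bottom_half) {
        /* top half done: rewrite Y and the bank pointer so the same
         * hardware sprite reappears lower on the screen */
        SPRITE0_Y   = 180;
        SPRITE0_PTR = 33;
        VIC_RASTER  = 250;           /* fire again near the bottom */
    } else {
        /* bottom/vblank: restore the top-half set for the next frame */
        SPRITE0_Y   = 60;
        SPRITE0_PTR = 32;
        VIC_RASTER  = 120;           /* fire again mid-screen */
    }
    bottom_half ^= 1;
}
```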


jochance wrote (#15), replying to CodeWraith:

IIRC, the NES one and some others will not work with modern TVs because they depended heavily on the CRT tech itself. We thought of it as "shooting the TV", but I seem to recall (or may be confusing it with something else) reading that the gun was technically being shot by the beam from the TV and sending the angle of that back to the console, which would deduce from that where the gun was aimed. There are newer ones (the Wii and PS4 had them for sure), but they are based on different technologies. The PS4 had big balls of light on the controllers that a camera watched. The Wii used some kind of IR system with a bar you put in front of the TV. I suppose games are still "racing the ray" in a sense; it's just that you don't get garbage on screen, but rather a frozen screen or choppy framerate if frames aren't coming fast enough.


User 11907673 wrote (#16), replying to CodeWraith:

Wow, this really took me back 40+ years. While I never "raced the beam", I did plenty of other coding around interrupts (actual REAL interrupts, not the software-abstracted ones of today) and syncing them up with the code.


Juan Pablo Reyes Altamirano wrote (#17), replying to CodeWraith:

It sounds like fun times. I never got that far back. I coded 6502 asm, but Nintendo's hardware (the NES) already had a pretty slick PPU that made timing somewhat easy. I later coded in ARM32 asm for the DSi, but that was mostly to do basic geometric transforms in the weird video memory it had (bitmaps were not stored in a straightforward manner).


Matthew Barnett wrote (#18), replying to CodeWraith:

I'm surprised no one's mentioned the ZX80 and ZX81. They didn't have specialised circuitry to handle the display; instead, they had the Z80 'execute' the contents of the screen, with the hardware ensuring that the Z80 itself saw only NOPs until the end of the line. The contents of the data bus (the actual character codes) were then fed to the character generator.
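A toy model of that fetch loop in C, just to show the idea; the character decoding at the end is invented, since the real machine routed the byte to the character generator ROM while the CPU ate a forced NOP:

```c
/* Toy model of the ZX81 trick: the CPU "executes" the display file while
 * the hardware substitutes NOPs and routes each fetched byte to the
 * character generator instead. */
#include <stdint.h>
#include <stdio.h>

#define NEWLINE 0x76  /* Z80 HALT opcode doubles as the end-of-line marker */

void scan_display_file(const uint8_t *dfile, int len)
{
    for (int i = 0; i < len; i++) {
        uint8_t byte = dfile[i];   /* CPU fetches this as an "instruction" */
        if (byte == NEWLINE) {     /* HALT: wait for the next line */
            putchar('\n');
            continue;
        }
        /* hardware forces NOP onto the CPU data bus; the character code
         * goes to the character generator -- the lookup here is a stand-in */
        putchar(' ' + (byte & 0x3F));
    }
}
```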


Peter Shaw wrote (#19), replying to CodeWraith:

I wouldn't necessarily say "racing the beam", but I had plenty of fun on the C64 trying to program ever more complex "copper effects", and on the BBC trying to cram as much 6845 CRTC hardware trickery as I could inside the VBI fly-back event. I did some interesting experiments when I first bought this book: [Graphics Programming Black Book Special Edition (with CD-ROM): Amazon.co.uk: Abrash, Michael: 9781576101742: Books](https://www.amazon.co.uk/Graphics-Programming-Black-Special-CD-ROM/dp/1576101746) Lots of juicy DOS/PC-based hardware graphics trickery, a lot of which depends on tight frame-rate and scan-line timing. FUN TIMES!!!


CodeWraith wrote (#20), replying to Peter Shaw:

I have that book on the shelf myself. A shame that all that pedal-to-the-metal assembly code has since been replaced by hard-wired logic in some graphics processor. But things have been going that way all along. It turns out that even my first computer kind of raced the electron beam, but it was automated via interrupts and DMA, so all I had to learn was how to set up an interrupt routine; I no longer had to worry about how to put pixels on the screen. The computer's design goes back to 1976. At that time, having graphics at all was a complicated and expensive affair, so little single-board computer kits usually did not have any. To make graphics not only affordable but also relatively uncomplicated to use was a small wonder.


CodeWraith wrote (#21), replying to Juan Pablo Reyes Altamirano:

The memory addressing logic was designed to let whatever graphics hardware you had access its video buffer quickly, not for the programmer's convenience. The hardware was racing the electron beam for you, so there was little time to waste. And then there is also the old problem of how to synchronize CPU and graphics hardware access to the same memory.


CodeWraith wrote (#22), replying to User 11907673:

Interrupts - what a luxury. On my old box I do bit-banged serial communication without a UART. Currently all is well at 19200 baud. 38400 works for single bytes, but not for larger memory blocks: the timing error in the delay loops obviously accumulates too much when too many bytes are sent at once. Maybe I could resynchronize at every start bit, but I would have to overclock the old 8-bit processor a little more. It could go faster than 8 MHz if I raised the processor's core voltage above 5V. Perhaps it would then even become noticeably warm and the processor would actually require cooling.
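The shape of the problem, as a sketch: with open-loop delays, a per-bit timing error of e repeats for all ten bits of every frame, so an n-byte block drifts by roughly 10·n·e, while a receiver that resynchronises on every start-bit edge only has to stay accurate for one frame. The two extern hooks are stand-ins for whatever the hardware provides:

```c
/* Bit-banged async serial TX, timed entirely by busy-wait delays. */
#include <stdint.h>

#define BIT_US (1000000u / 19200u)  /* ~52 us per bit at 19200 baud */

extern void tx_pin_write(int level); /* assumed platform hook            */
extern void delay_us(uint32_t us);   /* assumed busy-wait: the error source */

void uart_tx_byte(uint8_t b)
{
    tx_pin_write(0);                 /* start bit */
    delay_us(BIT_US);
    for (int i = 0; i < 8; i++) {    /* data bits, LSB first */
        tx_pin_write((b >> i) & 1);
        delay_us(BIT_US);
    }
    tx_pin_write(1);                 /* stop bit */
    delay_us(BIT_US);
}
```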


CodeWraith wrote (#23), replying to jochance:

Light guns (or light pens) were not very complicated. You could build one yourself with some cheap parts from Radio Shack. All you basically needed was a photocell, a button, and a toy gun to put them in. Many graphics chips simply had registers that reported the current position of the electron beam. When the sensor in the gun detected the electron beam, the gun had to be pointed at exactly those screen coordinates. No wild calculation of angles or anything like that. But this of course does not work when there is no electron beam to detect.
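On chips that had them, reading the pen really was this simple. The addresses below are the C64 VIC-II light-pen latches, which freeze the beam position the instant the photocell fires; other chips had equivalents:

```c
#include <stdint.h>

#define LIGHTPEN_X (*(volatile uint8_t *)0xD013) /* latched X, in 2-pixel steps */
#define LIGHTPEN_Y (*(volatile uint8_t *)0xD014) /* latched Y raster line       */

/* The video chip latched the beam position the moment the pen's
 * photocell fired; software just reads the result. */
void read_pen(unsigned *x, unsigned *y)
{
    *x = LIGHTPEN_X * 2u;  /* the VIC-II stores X/2 */
    *y = LIGHTPEN_Y;
}
```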


Peter Shaw wrote (#24), replying to CodeWraith:

I may have to build myself an MS-DOS (circa 1995) computer, just so I can do some nostalgic programming again. I do still have a set of Masm32 disks somewhere :-)


User 11907673 wrote (#25), replying to Matthew Barnett:

Z80! I haven't heard that name for almost 40 years. I used to write assembler for it, too. You know, this really means we are all a bunch of old farts!


User 11907673 wrote (#26), replying to CodeWraith:

Impressive! I never went below the UART interrupt level. Your mention of 8 MHz reminded me of one of the things I tell the once-a-week Data Structures class I teach - namely, that when I first started working with microprocessors, clock rates were in kHz, not the GHz they are now.
