Code Project

CodeWraith (@CodeWraith)

Posts: 4.4k   Topics: 309   Shares: 0   Groups: 0   Followers: 0   Following: 0

Posts

  • I have a codeproject hued cat
    CodeWraith

    All possible colors plus stripes, all at once. The Calico Tabby. Not breedable because some of the color information is on the X chromosome and only females have two X chromosomes. There are no Calico Tabby males you could breed them with.

    The Lounge design com graphics iot lounge

  • Is that true?
    CodeWraith

    And we probably can forget any stereotypical smoke signals. Virtue signaling, however...

    The Lounge question

  • Is that true?
    CodeWraith

    I heard that the Vegans originally were a tribe of Indians who lived near what is today the west coast of the US. Their name meant something like 'terrible hunters'.

    The Lounge question

  • I got Shakespeare's chewed pencil...
    CodeWraith

    You were not thrown off a bridge for that, obviously.

    The Lounge javascript cloud csharp linq com

  • Hot wave...
    CodeWraith

    And you spent the time since then in Antarctica? Was that really the last warm day for you?

    The Lounge question

  • Hot wave...
    CodeWraith

    At this time of the year that would be a not-so-hot day in Texas, and I would have adopted the lifestyle of a crocodile and spent most of the day in the pool.

    The Lounge question

  • Racing the electron beam once more
    CodeWraith

    Sounds a lot like racing the beam. Just because you did not have to do that for everything anymore did not mean you could not use it to wring a few unusual effects out of your graphics hardware. Even then, such things were already becoming arcane and secret knowledge.

    The Lounge performance css graphics game-dev help

  • Racing the electron beam once more
    CodeWraith

    Not if you have something like an MMU that keeps the processor blissfully unaware that it is actually roaming around in paged memory, so you can call anything at any time without having to fear any complications.

    The Lounge performance css graphics game-dev help

  • Racing the electron beam once more
    CodeWraith

    Yes, times have changed. The old computer is from a time when even 16k was an expensive dream. Any OS at all was a luxury. ROMs were just as tiny, and there is only so much you can do with that limited space. You can't have drivers or routines for and against everything. In a paged memory model you can pack your code into modules similar to DLLs. Each module gets its own memory page, as if it were the only thing running on the computer. Sound familiar? It's just giving an old processor the same royal treatment as a modern one, and suddenly the whole computer becomes more modern than it has any right to be. It's all about teaching a very old dog some new tricks, and lack of memory is the most common argument against doing that. (A sketch of this bank-switched module idea follows after the post list.)

    The Lounge performance css graphics game-dev help

  • Racing the electron beam once more
    CodeWraith

    Talk about first world problems. :) It's kind of easy to be virtuous as long as you have plenty. Having only the bare minimum may make you look a little stingy or greedy because you always have a good use for a little more.

    The Lounge performance css graphics game-dev help

  • Racing the electron beam once more
    CodeWraith

    Even in the old days I had a collection of subroutines to puzzle together whatever I needed without rewriting everything all the time. Today I let the assembler do that dirty work for me. Just a tiny change in the configuration and I can have double buffering, sprites, a text mode, or a different resolution. The only thing I can't do is switch around these options at runtime. It would be possible, but then I would have to keep everything in memory at once and always reserve the largest buffers, just in case. Not a very economical use of the small amount of memory available.

    But fear not: by slightly expanding the memory by a few megabytes and figuring out a way to switch memory pages without the processor noticing anything, I can keep lots of code in memory at once and do things that were far out of reach for a little 8 bit processor. The lessons I learned from the old computer: use your memory as well as you can, and there is no such thing as enough memory. I will always find a good use for a little more, even without being wasteful.

    The Lounge performance css graphics game-dev help

  • Racing the electron beam once more
    CodeWraith

    I need a break. Time to go back to when things were simple (or at least appeared to be) and do something just for fun. Like writing a little game for my old box. A little 8 bit processor, 4k RAM, a weird little graphics chip and the assembler are all you need. But wait, this is tech from 1976! A graphics chip? Yep, we are racing the electron beam again. But they did that in such a clever way that a kid could get it to work. It involved interrupts and DMA, and all code that had to stay in sync with the electron beam was contained in an interrupt routine less than 32 instructions long.

    However, that simplicity still does not come without a price. The graphics chip issues 1024 x 60 DMA requests every second and also calls the interrupt routine 60 times a second. Whatever is going on in that interrupt routine adds up very quickly and takes away a good percentage of the instructions per second the processor can 'waste' on such luxuries as actually executing its program. Just how much, exactly? Those interrupt routines come in two flavors and we get two very different values. After all these years I have now taken the time to actually do the math.

    The worst case is those interrupt routines that manipulate the DMA pointer to repeat every raster line two or more times. To do that, you have to stay in the interrupt routine for the entire duration of the frame, leaving only the vertical blank period for program execution. Just as bad as racing the beam always was. At least you had a more useful vertical resolution this way and needed a significantly smaller graphics buffer. Still, this left you with only 33.64% of the CPU time for your program. Ouch.

    The better option was not to race the beam at all. The interrupt routine merely reset the DMA pointer to the beginning of your graphics buffer for every frame and did not hang around any longer to repeat any scan lines. That left you with a weird resolution of 64 x 128 pixels and required a graphics buffer of 1024 bytes, but also left 71.63% of the CPU time for the actual program. (The buffer and DMA arithmetic is spelled out in a short sketch after the post list.)

    So, which option would you choose? Memory is not as much of an issue as it used to be, but I think I can live with a weird resolution and take the performance gain. By the way, the same old processor, unrestricted by the old graphics chip, gives me more than 12 times the instructions per second compared to that worst case I have been using for 45 years now. And I have not even really tried to overclock it yet.

    The Lounge performance css graphics game-dev help

  • Did anyone here ever race the electron beam? And did you win the race?
    CodeWraith

    Light guns (or light pens) were not very complicated. You could build them yourself with some cheap parts from Radio Shack. All you basically needed was a photocell, a button and a toy gun to put these into. Many graphics chips simply had registers that told the current position of the electron beam. When the sensor in the gun detected the electron beam, the gun had to be pointed at exactly those screen coordinates. No wild calculation of angles or anything like that. But this, of course, does not work when there is no electron beam to detect. (A sketch of this beam-position trick follows after the post list.)

    The Lounge graphics debugging tools performance question

  • Did anyone here ever race the electron beam? And did you win the race?
    CodeWraith

    Interrupts - what a luxury. On my old box I do bit-banged serial communication without a UART. Currently all is well at 19200 baud. 38400 works for single bytes, but not for larger memory blocks. The timing error in the delay loops obviously accumulates too much when too many bytes are sent at once. Maybe I can resynchronize at every start bit, but I would have to overclock the old 8 bit processor a little more. It could go faster than 8 MHz if I raised the processor's core voltage above 5V. Perhaps it would then even become noticeably warm and the processor would actually require cooling. (A sketch of such a resynchronizing receive routine follows after the post list.)

    The Lounge graphics debugging tools performance question

  • Did anyone here ever race the electron beam? And did you win the race?
    CodeWraith

    The memory addressing logic was designed to let whatever graphics hardware you had access its video buffer quickly, not for the programmer's convenience. The hardware was racing the electron beam for you, so there was little time to waste. And then there is also the old problem of how to synchronize CPU and graphics hardware access to the same memory.

    The Lounge graphics debugging tools performance question

  • Did anyone here ever race the electron beam? And did you win the race?
    CodeWraith

    I have that book on the shelf myself. A shame that all that pedal-to-the-metal assembly code has since been replaced by hard-wired logic in some graphics processor. But things have been that way all along. Turns out that even my first computer kind of raced the electron beam, but it was automated via interrupts and DMA, so all I had to learn was how to set up an interrupt routine and I did not have to worry about how to put pixels on the screen anymore. The computer's design goes back to 1976. At that time, having graphics at all was a complicated and expensive affair, so little single-board computer kits usually did not have any at all. Making it not only affordable but also relatively uncomplicated to use was a small wonder.

    The Lounge graphics debugging tools performance question

  • Did anyone here ever race the electron beam? And did you win the race?
    CodeWraith

    I built my first computer in 1978 and still have it. So you essentially built a light gun. I wonder, does shooting the (nonexistent) beam still work on modern monitors? Do any game consoles even have light guns anymore? It seems like I saw the last ones some time in the last millennium.

    The Lounge graphics debugging tools performance question

  • Did anyone here ever race the electron beam? And did you win the race?
    CodeWraith

    Even dual-ported video memory did not solve this problem entirely. There still had to be hardware mutual-exclusion logic to prevent conflicts. Some graphics chips, like the Motorola MC6847, were nice enough to provide a signal that tells when the chip is accessing video RAM and when it is not. That was practically all you needed for mutual exclusion. Letting a graphics chip trigger an interrupt upon entering the vertical blank was another option. This would automatically also eliminate the problem of any other interrupts, since you were already in an interrupt routine. But, of course, that opens another can of worms wherever nested interrupts are allowed or things like non-maskable interrupts are a thing. (A sketch of the wait-for-the-chip approach follows after the post list.)

    The Lounge graphics debugging tools performance question

  • Did anyone here ever race the electron beam? And did you win the race?
    CodeWraith

    Racing the beam is a way to put graphics on a screen without having any graphics memory. It was used in the 1970s, when memory chips still had a tiny capacity and cost their weight in gold or more. The Atari VCS is a well-known console that used this. It had only 128 bytes of RAM, and the programs were on ROMs in the cartridges, so there was no room at all for any video buffer. It is called 'racing the beam' because most of the time the processor is busy staying ahead of the electron beam of the CRT monitor, putting the graphics data that will be displayed next directly into the registers of the graphics chip just in time. Be too quick or too slow and you have only garbage on the screen. And such luxuries as actual gameplay had to wait until the graphics chip was done with the current frame and entered the vertical blank period before starting with the next frame. Horribly fragile code and a nightmare to debug. Even proper debugging tools as we know them did not exist yet. But programmers who can deal with such old stuff are afraid of nothing. (An outline of such a scan-line loop follows after the post list.)

    The Lounge graphics debugging tools performance question

  • I'd like the Peter Lorre version please.
    CodeWraith

    I had replaced many Windows sounds and Visual Studio sounds with simple voice samples of HAL or Star Trek's Nomad probe. How do you like HAL's 'My mind is going. I can feel it.' as the Windows shutdown sound? Or Nomad's 'You are in error! You are a biological unit! You are imperfect!' on Visual Studio compilation errors?

    The Lounge question announcement
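
Code sketches

The "Racing the electron beam once more" replies above describe packing code into DLL-like modules, each living in its own memory page that is swapped in behind the processor's back. A minimal C sketch of that bank-switching idea; the mapper latch, window address and page layout are made up for illustration, not the hardware described in the posts.

    #include <stdint.h>

    /* Hypothetical bank-switching hardware - the latch and window addresses are
     * placeholders. Writing a page number to the latch maps that page of the
     * expanded RAM into a fixed window, without the CPU noticing anything. */
    #define BANK_LATCH    (*(volatile uint8_t *)0xFF00)
    #define MODULE_ENTRY  ((void (*)(void))0x8000)   /* every module begins at the window base */

    static uint8_t current_page;   /* shadow copy, in case the latch is write-only */

    /* Map a module's page into the window, call it, then restore the page that
     * was mapped before. call_module itself (and its caller) must live in memory
     * that is always mapped, such as ROM or a fixed page. */
    static void call_module(uint8_t page)
    {
        uint8_t previous = current_page;
        current_page = page;
        BANK_LATCH = page;        /* swap the module in */
        MODULE_ENTRY();           /* run it as if it owned the machine */
        current_page = previous;
        BANK_LATCH = previous;    /* swap the caller's page back */
    }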
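
The figures in the long "Racing the electron beam once more" post (a 1024-byte buffer, 1024 x 60 DMA requests per second, a 64 x 128 resolution) follow from simple arithmetic; the CPU-time percentages are quoted from the post, not re-derived here. A tiny C program that just prints the numbers:

    #include <stdio.h>

    int main(void)
    {
        /* Figures taken from the post: a 64 x 128 display at one bit per pixel,
         * refreshed 60 times per second by DMA. */
        const int width  = 64;
        const int height = 128;
        const int frames_per_second = 60;

        const int  buffer_bytes   = width * height / 8;                      /* 1024 bytes */
        const long dma_per_second = (long)buffer_bytes * frames_per_second;  /* 1024 x 60  */

        printf("graphics buffer:         %d bytes\n", buffer_bytes);
        printf("DMA requests per second: %ld\n", dma_per_second);

        /* CPU time left for the program, as stated in the post (not derived here):
         * about 33.64% when the interrupt routine repeats scan lines for the whole
         * frame, about 71.63% when it only resets the DMA pointer once per frame. */
        return 0;
    }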
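
The light-gun explanation in the "Did anyone here ever race the electron beam?" thread boils down to: wait until the photocell sees the beam, then read the beam-position registers. A sketch with made-up register addresses, not any particular chip's register map:

    #include <stdint.h>

    /* Hypothetical I/O locations - placeholders for illustration only. */
    #define BEAM_X    (*(volatile uint8_t *)0xD000)  /* current horizontal beam position */
    #define BEAM_Y    (*(volatile uint8_t *)0xD001)  /* current vertical beam position   */
    #define GUN_PORT  (*(volatile uint8_t *)0xD002)  /* bit 0: photocell sees the beam   */
    #define GUN_HIT   0x01

    /* Wait until the photocell in the gun catches the passing electron beam, then
     * read where the beam is at that instant - that is where the gun is pointing.
     * No calculation of angles, exactly as the post says. */
    static void read_light_gun(uint8_t *x, uint8_t *y)
    {
        while ((GUN_PORT & GUN_HIT) == 0)
            ;                     /* busy-wait for the flash */
        *x = BEAM_X;              /* latch the position the moment the cell fires */
        *y = BEAM_Y;
    }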
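
The bit-banged serial post mentions resynchronizing on every start bit so the delay-loop error cannot accumulate across a memory block. A sketch of such a receive routine (8N1, LSB first); the port address, polarity and delay constants are assumptions and would have to be calibrated to the real clock.

    #include <stdint.h>

    /* Hypothetical input port; bit 0 is the RX line, idle high. */
    #define SERIAL_PORT  (*(volatile uint8_t *)0xE000)
    #define RX_BIT       0x01

    /* One bit time at the chosen baud rate. The loop counts are placeholders
     * and must be calibrated against the actual CPU clock. */
    static void delay_bit_time(void)
    {
        for (volatile uint16_t i = 0; i < 100; i++)
            ;
    }

    static void delay_half_bit_time(void)
    {
        for (volatile uint16_t i = 0; i < 50; i++)
            ;
    }

    /* Receive one byte, 8N1. Waiting for the falling edge of each start bit
     * re-synchronizes the sampling, so the timing error only has to stay small
     * for ten bit times instead of a whole memory block. */
    static uint8_t receive_byte(void)
    {
        uint8_t value = 0;

        while (SERIAL_PORT & RX_BIT)      /* wait for the start bit (line goes low) */
            ;
        delay_half_bit_time();            /* move the sample point to mid-bit */

        for (int bit = 0; bit < 8; bit++) {
            delay_bit_time();             /* middle of the next data bit */
            value >>= 1;
            if (SERIAL_PORT & RX_BIT)     /* LSB arrives first */
                value |= 0x80;
        }

        delay_bit_time();                 /* let the stop bit pass */
        return value;
    }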
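
The dual-ported-memory reply mentions chips like the MC6847 providing a signal that says when the chip is fetching from video RAM, which is almost all the mutual exclusion you need. How that signal reaches the CPU is system-specific, so the status port below is a placeholder.

    #include <stdint.h>

    /* Hypothetical status port and buffer address - placeholders only. */
    #define VIDEO_STATUS  (*(volatile uint8_t *)0xC000)
    #define VRAM_BUSY     0x01               /* set while the chip fetches display data */

    static volatile uint8_t *const video_ram = (volatile uint8_t *)0x4000;

    /* Write one byte into the video buffer without fighting the graphics chip over
     * the bus: spin until the display side lets go, then write. The other option
     * from the post is to do all writes from a vertical-blank interrupt instead. */
    static void vram_write(uint16_t offset, uint8_t value)
    {
        while (VIDEO_STATUS & VRAM_BUSY)
            ;
        video_ram[offset] = value;
    }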
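
Finally, the "racing the beam" explanation itself, reduced to an outline: with no frame buffer, the program feeds the display registers one scan line at a time, just ahead of the beam, and gameplay only happens during vertical blank. Real VCS kernels are cycle-counted assembly, so this C sketch with stand-in registers (not the real TIA) only shows the structure.

    #include <stdint.h>

    /* Stand-in display registers - illustrative only. */
    #define WAIT_HSYNC   (*(volatile uint8_t *)0xF000)  /* any write stalls the CPU until the next scan line */
    #define PLAYFIELD    (*(volatile uint8_t *)0xF001)  /* pattern shown on the upcoming scan line           */
    #define VBLANK_FLAG  (*(volatile uint8_t *)0xF002)  /* nonzero during vertical blanking                  */

    #define VISIBLE_LINES 192

    static uint8_t frame_graphics[VISIBLE_LINES];  /* one pattern byte per line, prepared during blanking */

    /* One video frame: there is no video buffer in the hardware, so the program
     * itself must hand the chip fresh data for every single scan line. */
    static void do_frame(void)
    {
        for (int line = 0; line < VISIBLE_LINES; line++) {
            PLAYFIELD = frame_graphics[line];  /* must land before the beam reaches this line */
            WAIT_HSYNC = 0;                    /* stall until the beam starts the next line   */
        }

        /* Only now, during vertical blank, is there time for such luxuries as
         * actual gameplay - update frame_graphics for the next frame here. */

        while (VBLANK_FLAG)
            ;                                  /* when blanking ends, the race starts again */
    }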