Wicked code!
-
This, my friends, is really neat. With my GFX library:
// BGRx: blue, green, red, plus an unused placeholder channel
template<size_t BitDepth>
using bgrx_pixel = gfx::pixel<
    gfx::channel_traits<gfx::channel_name::B, (BitDepth / 4)>,
    gfx::channel_traits<gfx::channel_name::G, (BitDepth / 4)>,
    gfx::channel_traits<gfx::channel_name::R, (BitDepth / 4)>,
    gfx::channel_traits<gfx::channel_name::nop, (BitDepth / 4)>>;
You can then do
auto col = color<bgrx_pixel<32>>::purple
and get a pixel back in the format 0xBBGGRRFF (where RR, GG, and BB are each a byte representing the red, green, and blue color channels, and FF is just a NOP channel - unused - and set to 0xFF). Purple would be something like 0xFF00FFFF.
With this you can create bitmaps (bitmap<bgrx_pixel<32>>) in this format and feed them straight to DirectX, which happily eats bitmaps with this pixel footprint. The template code that makes this work is pretty crazy. The fact that it's easy to declare new pixel formats made it possible for me to port my code from IoT to DirectX on a PC, so I can rapidly prototype without having to upload to a device each time. I've never really used this feature in the wild; my color format is usually 16-bit color, monochrome, or grayscale.
It was confusing at first, though, because originally my NOP channel was an alpha channel, so nothing rendered (black screen): DirectX is set to ignore that channel, while my GFX library assumed it would be respected. Once I got past that, the rest was easy. The same code of mine runs on Arduino, ESP-IDF, probably Zephyr, and now the PC. Say what you want about C and C++ - they truly run everywhere.
Check out my IoT graphics library here: https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here: https://honeythecodewitch.com/uix
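To make the byte layout concrete, here's what that packing amounts to in plain standard C++ - just an illustration of the byte order described above, not GFX's actual code:
#include <cstdint>
#include <cstdio>

// Pack one BGRx pixel: blue in the top byte, then green, then red,
// then the unused NOP byte fixed at 0xFF.
constexpr uint32_t pack_bgrx(uint8_t r, uint8_t g, uint8_t b) {
    return (uint32_t(b) << 24) | (uint32_t(g) << 16) | (uint32_t(r) << 8) | 0xFFu;
}

int main() {
    constexpr uint32_t purple = pack_bgrx(0xFF, 0x00, 0xFF);
    static_assert(purple == 0xFF00FFFF, "matches the 0xBBGGRRFF layout");
    printf("purple = 0x%08X\n", (unsigned)purple);
}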
-
Congratulations! That is worthy of a celebration, however you celebrate! Someday maybe I'll have time to master such code as well - it sounds fun!
Our Forgotten Astronomy | Object Oriented Programming with C++ | Wordle solver
-
What I find really amazing is that the compilers can actually deal with such source. You can probably guess that I have never written a compiler.
-
I'm wondering how it will work on 32-bit vs. 64-bit? I mean at compile time, when you don't know what the target machine runs on.
I have some information about the target machine, like endianness and word size. If I don't support the platform, you can fill in that information with -D defines.
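Something along these lines - note the macro names below are invented for illustration; they are not GFX's actual configuration macros:
// Sketch of the -D override idea. On a known compiler the byte order is
// detected automatically; on an unrecognized platform you'd build with e.g.
//   g++ -DMYLIB_LITTLE_ENDIAN=1 ...
#ifndef MYLIB_LITTLE_ENDIAN
    #if defined(__BYTE_ORDER__) && (__BYTE_ORDER__ == __ORDER_LITTLE_ENDIAN__)
        #define MYLIB_LITTLE_ENDIAN 1
    #elif defined(__BYTE_ORDER__) && (__BYTE_ORDER__ == __ORDER_BIG_ENDIAN__)
        #define MYLIB_LITTLE_ENDIAN 0
    #else
        #error "Unknown byte order: pass -DMYLIB_LITTLE_ENDIAN=0 or 1"
    #endif
#endif

#if MYLIB_LITTLE_ENDIAN
// little-endian channel packing goes here
#else
// big-endian channel packing goes here
#endif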
-
I should add that this routinely runs on 32-bit machines, and this same code is now running on a 64-bit machine.
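Part of why the word size mostly doesn't matter here: the pixel types are built on fixed-width integers, which are the same size on 32-bit and 64-bit targets. A quick plain-C++ illustration (not library code):
#include <cstdint>

// A 32-bit BGRx pixel is exactly four bytes regardless of whether the
// host word size is 32 or 64 bits; it's pointer/size_t widths that change.
struct bgrx32 { uint8_t b, g, r, x; };

static_assert(sizeof(bgrx32) == 4, "pixel size is independent of word size");
static_assert(sizeof(uint32_t) == 4, "fixed-width types don't change");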
-
What I find really amazing is that the compilers can actually deal with such source. You can probably guess that i have never written a compiler.
C++ compilers are something else, even at that. Pure magic.
-
I'm wondering how it will work in 32/64? I mean at compile time, where you don't know what the target machine runs on.
Does 32-bit not run on 64-bit? The other way around, I know, is not possible (or at least not easy), but I thought running 32-bit on 64-bit would not be a big deal.
M.D.V. ;) If something has a solution, why do we have to worry about it? If it has no solution, for what reason do we have to worry about it? Help me to understand what I'm saying, and I'll explain it better to you. Rating helpful answers is nice, but saying thanks can be even nicer.
-
You're thinking of the x86 platform. That might not apply to microcontrollers or SBCs.
The difficult we do right away... ...the impossible takes slightly longer.
-
true :-O