They do it because if work were easy it wouldn't be work, would it? *hides*
I've been very entertained by this: For those of you that aren't halfway in the ground yet (you kids will all die in The Water Wars), EGA - the Enhanced Graphics Adapter - was a circa 1984 display adapter for PCs with a 16-color palette selectable from 64 possible colors.

My graphics library allows you to deal with input and output data in any pixel format you want, even EGA. I've been rendering SVGs to EGA as a sort of anachronistic attempt at avoiding responsibility, and it is very gratifying somehow. Makes me want to listen to some Bengals or something.

AMD Ryzen Logo 32-bit color[^] AMD Ryzen Logo EGA 4-bit palette[^] Tiger 32-bit color[^] Tiger EGA 4-bit palette[^]

What's surprising is how faithfully it represents the images, even though I'm not adjusting the palette from the default 16 colors (EGA has 64, which I can use). It's so easy to mess with EGA that I've got this mess:
// map an EGA palette to RGBA32
using pal_t = ega_palette<rgba_pixel<32>, false>;
pal_t pal;
using pixel_t = vector_pixel; // ARGB32
//rgb_pixel<24>;  // (callbacks, not direct) no alpha WORKS
//rgb_pixel<16>;  // (direct RGB16) WORKS
//rgba_pixel<32>; // (direct RGBA32) WORKS
//rgb_pixel<18>;  // unaligned (use callbacks, not direct)
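If you're wondering how a 32-bit image survives the trip down to 4 bits at all, the core of it is just nearest-color matching against the 16 default entries. Here's a minimal standalone sketch of that idea - not the gfx API; the table and the nearest_ega_index helper are mine, purely for illustration:

#include <stdint.h>
#include <stddef.h>

// The 16 default EGA colors as RGB888 triples (the classic CGA/EGA defaults).
static const uint8_t ega_default[16][3] = {
    {0x00,0x00,0x00},{0x00,0x00,0xAA},{0x00,0xAA,0x00},{0x00,0xAA,0xAA},
    {0xAA,0x00,0x00},{0xAA,0x00,0xAA},{0xAA,0x55,0x00},{0xAA,0xAA,0xAA},
    {0x55,0x55,0x55},{0x55,0x55,0xFF},{0x55,0xFF,0x55},{0x55,0xFF,0xFF},
    {0xFF,0x55,0x55},{0xFF,0x55,0xFF},{0xFF,0xFF,0x55},{0xFF,0xFF,0xFF}
};

// Return the palette index (0-15) nearest to an RGB888 color, using plain
// squared distance in RGB space. Hypothetical helper, not part of the gfx library.
static uint8_t nearest_ega_index(uint8_t r, uint8_t g, uint8_t b) {
    uint8_t best = 0;
    uint32_t best_dist = 0xFFFFFFFFu;
    for (size_t i = 0; i < 16; ++i) {
        int dr = (int)r - (int)ega_default[i][0];
        int dg = (int)g - (int)ega_default[i][1];
        int db = (int)b - (int)ega_default[i][2];
        uint32_t dist = (uint32_t)(dr * dr + dg * dg + db * db);
        if (dist < best_dist) { best_dist = dist; best = (uint8_t)i; }
    }
    return best;
}

Swap in a different distance metric or add dithering and you can squeeze out more, but even a plain nearest match holds up surprisingly well on images like the tiger.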
This happens to me all the time, and then I curse the person that decided copy and paste shortcuts needed to be right next to each other.
I've always had that problem here periodically. It usually rights itself after an hour or so though.
It's usually not a matter of "can I do something?" (the answer is almost always yes, but you won't like the details), but rather "how the hell do I accomplish this?" I'm currently running into one such issue trying to avoid template creep, and I thought I had it, but no. I missed a detail. Which I think I just solved while typing this. Edit: maybe. Hmmm.

Such is the nature of this language. It's a giant puzzle box. With C#, as often as not it's a matter of learning APIs. With C++, it's usually a matter of learning to cajole the compiler into bending your source code the way you want it. It's uniquely challenging, not because of pointers or the usual footguns that people tend to complain about with the C-family mid-level languages, but because of templates and all of the dynamics and paradigms they introduce into the language. It's really kind of amazing what you can do with them.
Phil J Pearson wrote:
Also, it never seems to take so long or be such hard work the second time.
You are so right about that! I just finished retooling and testing. It's all using my new span API. Normally, I am very gifted at designing by the seat of my pants (while coding). For example my graphics library lasted years without a significant breaking change, especially to the design. I added a ton of features in that time. It's a combination of almost 4 decades of coding + some latent ability. But maybe that's why I get frustrated with myself when I miss the mark.
I spent several days making my vector canvas able to "direct bind" to a 'draw target' if it happened to be a bitmap with a supported pixel format. I did this by extracting the pointer to the bitmap and then doing the standard (y * stride) + x * (stride / width) on it to get my final pointer. That sucks. For starters, it only works with hard bitmaps. I can do better.

Enter gfx_span, which is a little structure with a pointer and a length. You can then do span(location).data and/or span(location).length off a bitmap to get a pointer and a length for the remainder of that row. This is important, because it opens up the blt capability (direct read/write) to more than just bitmaps. For example, in my UIX library, the control surface draw target does a translation and clip before writing to the backing bitmap. Without span() I cannot get a raw pointer to the backing bitmap data; I have to use methods off the draw target like point() and fill(), which is generally much slower - all to do that translation and clip.

Unfortunately, the existing code I've worked on for days will not survive this change. I have a lot of work in front of me, all because this span paradigm didn't occur to me on, like, Monday. :mad: Curse my brain.
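To make the idea concrete, here's a rough standalone sketch of the pattern being described - the names are mine for illustration, not the actual gfx_span API. A draw target hands back a pointer and remaining-row length for a location, and the caller can then write a run of pixels directly instead of calling a per-pixel method in a loop:

#include <stdint.h>
#include <stddef.h>

// Illustrative sketch only - names are assumptions, not the gfx API.
struct row_span {
    uint8_t* data;   // pointer to the pixel at (x, y)
    size_t length;   // bytes remaining in that row, from x to the right edge
};

// A toy RGB565 bitmap that can hand out row spans.
struct bitmap16 {
    uint8_t* bits;
    int width, height;
    int stride;      // bytes per row

    // Direct access to the rest of row y, starting at column x.
    row_span span(int x, int y) const {
        uint8_t* p = bits + (size_t)y * stride + (size_t)x * 2; // 2 bytes/pixel
        return { p, (size_t)(width - x) * 2 };
    }
};

// With a span, a horizontal run becomes one tight loop over raw memory
// instead of a point() call per pixel. Assumes the buffer is 2-byte aligned.
inline void fill_run(const bitmap16& bmp, int x, int y, int count, uint16_t color) {
    row_span s = bmp.span(x, y);
    size_t max_px = s.length / 2;
    size_t n = (size_t)count < max_px ? (size_t)count : max_px;
    uint16_t* px = reinterpret_cast<uint16_t*>(s.data);
    for (size_t i = 0; i < n; ++i) px[i] = color;
}

The win in the translation-and-clip case is that the draw target computes where the location actually lands in the backing bitmap before handing back the pointer, so the caller still gets direct access without knowing anything about the translation.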
Also, looking at that code, it doesn't do what the website does at all. It has no conception of UTF encoding. Pro tip (this one's free): don't ask GPT for help with code.
It doesn't, because I know how to do that. It's more effort to build that, or even to just launch a terminal window to run it, than to navigate to that website and type in a string. I was actually using it to test UTF-8 -> UTF-32 conversion in my graphics lib - which does NOT use the C++ runtime or the STL, because embedded.
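For context, a freestanding UTF-8 -> UTF-32 decoder really is small enough to carry around without the runtime. This is a minimal sketch of the general technique - a hypothetical helper of my own, not the gfx library's actual routine - handling 1-4 byte sequences and rejecting malformed input:

#include <stdint.h>
#include <stddef.h>

// Decode one UTF-8 sequence starting at s (n bytes available) into *out.
// Returns the number of bytes consumed, or 0 on malformed/truncated input.
// No STL, no exceptions - suitable for embedded targets.
static size_t utf8_to_utf32(const uint8_t* s, size_t n, uint32_t* out) {
    if (n == 0) return 0;
    uint8_t b = s[0];
    size_t len;
    uint32_t cp;
    if (b < 0x80)                { *out = b; return 1; }
    else if ((b & 0xE0) == 0xC0) { len = 2; cp = b & 0x1F; }
    else if ((b & 0xF0) == 0xE0) { len = 3; cp = b & 0x0F; }
    else if ((b & 0xF8) == 0xF0) { len = 4; cp = b & 0x07; }
    else return 0;                      // invalid lead byte
    if (n < len) return 0;              // truncated sequence
    for (size_t i = 1; i < len; ++i) {
        if ((s[i] & 0xC0) != 0x80) return 0; // bad continuation byte
        cp = (cp << 6) | (s[i] & 0x3F);
    }
    // Reject overlong encodings, surrogates, and out-of-range values.
    if ((len == 2 && cp < 0x80) || (len == 3 && cp < 0x800) ||
        (len == 4 && cp < 0x10000) ||
        (cp >= 0xD800 && cp <= 0xDFFF) || cp > 0x10FFFF) return 0;
    *out = cp;
    return len;
}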
I do find it pretty repulsive.
:~ X|
Mike Hankey wrote:
I read somewhere, I don't remember if for dogs or cats that you put a balloon where you don't want them to go and when they pop it they never go there again.
It sounds like a cat thing, and in most cases would probably be very effective. However, I have an orange cat. Therein lies the problem. She is as fearless (rides in cars happily) and mischievous as she is stupid. The balloon thing may end up exciting her and totally backfire on me.
I trained my cat to stop piddling in the bed. I intend to train her to stop getting into the spots in the bedroom where kitties are not supposed to be. As a rule I make it a point to do six impossible things before breakfast.
I should note that this isn't exactly related to Richard's overarching project, but it made me think of all this, given that his specific problem was very much a data entry problem. This almost deserves an article or something, but I'm just not up to writing it, so I'll leave it as a very general, code-free set of observations from directly working with people who needed to use my software to enter data.

When I was building little apps for local small businesses (a taxi company) or at least local chapters of larger organizations (the Boys and Girls Club), the first thing I always did was job shadow the person doing the job my software was either going to streamline or replace. I didn't write a line of code until I could do the job as well, if not as fast, as the operator doing the tasks my software targeted.

Do this. For the love of everything that is right, do this. I can't stress enough how much better your product will be. As often as not you'll come up with something that breaks all kinds of UI guidelines you were taught, but provides a far better DIRECT EXPERIENCE in the context of that operator's workflow than Microsoft ever could have thought of when writing their style guides. It's priceless. If you can't do that for whatever reason, then at least get regular feedback from someone who is dogfooding your code (you're dogfooding during development, right?), but that's still nothing like learning the tasks yourself.

From this experience, a few stupid observations:

- People that enter data like keyboards. Moving your fingers to the mouse means moving them off the home row. Every time you do that, it's productivity that could have been stored up for making a sandwich.
- You can type anything you can click, short of painting a picture. So the rest is just what they type and what you accept. That's where the effort should really go - gentle enforcement of business rules. Make your software work like a good employee: if it has a problem, it also suggests a solution, something that can be quickly keypressed past.
- Autocomplete is your friend, as long as it's unobtrusive.
- Entry history for fields is usually worth its weight in gold.

"BUT THAT'S SO 1990S!" you may scream. "What about all these fancy web-style user interfaces that are all the rage these days?" No. There's a time and place for that - usually on a phone. For working people, your interface is work boots, not heels.

AND FINALLY, USE YOUR OWN STUFF.

I stand by this stuff, and if I'm wrong about it, well being w
I'm going to commission a camera crew to start following open source projects and pitch the footage to TLC for a new reality TV program. I'll be rich.
Hey now, I don't necessarily give in to every errant urge that comes on. :laugh: True story: my first Halloween at Microsoft, I showed up and thought I'd wound up at a Star Trek convention. They'd improve developer productivity dramatically if they'd localize Visual Studio to Klingon.
I couldn't find the option to actually disable it in my BIOS settings. And yeah, I tried advanced mode. It lets me reorder them, and even SMART-scan them, but disabling them is apparently a league too far.
This is what happens when you give heavy equipment to men that never quite grew up. :laugh:
I have the strangest urge to give you a wedgie and stuff you in a locker after reading that.