Unknown unknowns
-
I thought I was going to get in front of my graphics device drivers by starting with the SSD1306 - a small monochrome OLED display that comes in 3 different sizes/resolutions and runs over SPI, I2C, or (rarely) parallel - the latter of which is basically never available on MCU breakouts, so in practice it's either SPI or I2C. It's a simple device, so I thought implementing it would be the easiest place to start.

Now keep in mind, I've written this graphics library such that it should seamlessly interface with my device drivers with very little massaging on my part. I've architected it so I can just Lego what's there together with my device drivers. Or so I thought.

This device has an integrated frame buffer, but you can't *read* the frame buffer over either of those serial interfaces - only over the parallel one you never have access to. That normally wouldn't be a show stopper, except it's a monochrome frame buffer, meaning 1 bit per pixel, and you can only write a byte at a time - so to change one pixel you'd first have to read the 8 pixels that share its byte. The upshot is that you can't really do random access to the frame buffer at all with this device. I had counted on not being able to read the buffer, but I hadn't counted on not being able to do random access when writing it. That was a corner I just didn't think around.

Software architecture was being discussed earlier, and I spent some time as a software architect professionally. What it taught me is that no matter how good at it you are, the results are mixed - not consistent - no matter how process oriented you are about it. The reason is things like the above. We can't predict the future. We can't think around every corner. Architecture comes with a certain amount of accepting that design means moving three steps forward and two steps back. There are ways to mitigate this: you can keep your code flexible and compartmentalized, such that if you rewrite part of it you don't have to rewrite all of it.

I saved myself because I use "template polymorphism" instead of standard polymorphism and inheritance in my library, so I don't have a bunch of base class changes to make that would wreck my entire source tree - though that comes with downsides as well. But you will be rewriting code on any non-trivial project. You will be redesigning bits as you go. Your best efforts are not put into heading all of that off, but rather into making your code flexible enough that it will survive having portions of it ripped out and completely rewritten.
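To make the problem concrete, here's a minimal sketch of the usual workaround - a shadow buffer in MCU RAM - assuming the common 128x64 variant. The struct name and layout are illustrative, not my library's actual API; the one real constraint is the SSD1306's page layout, where each byte holds 8 vertically stacked pixels:

```cpp
#include <cstdint>

// Illustrative sketch, not real library code. Because the panel's RAM
// is write-only over SPI/I2C, the read-modify-write for a single pixel
// has to happen against a local shadow copy in MCU RAM; you then push
// the touched byte (or the whole buffer) out to the display.
struct Ssd1306Shadow {
    static constexpr int W = 128, H = 64;   // assumed 128x64 variant
    uint8_t buf[W * H / 8] = {0};           // 1 bit per pixel, 1 KB

    void set_pixel(int x, int y, bool on) {
        uint8_t& b = buf[x + (y / 8) * W];  // byte holding this pixel's page
        const uint8_t mask = 1u << (y % 8); // bit within that byte
        if (on) b |= mask; else b &= ~mask; // the RMW happens here, in RAM
        // ...then queue this byte (or a dirty region) for transmission
    }
    bool get_pixel(int x, int y) const {
        return (buf[x + (y / 8) * W] >> (y % 8)) & 1;
    }
};
```

The cost, of course, is a kilobyte of RAM you wouldn't otherwise need - exactly the kind of trade-off that's hard to see coming when you design the library before the driver.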
-
honey the codewitch wrote:
What it taught me was no matter how good at it you are, the results are mixed, not consistent no matter how process oriented you are about it. The reason is because of things like the above. We can't predict the future. We can't think around every corner.
Amen! I see a paradoxical inverse relationship between the extent to which I may have become secure and confident in my technical abilities ... and ... the extent to which I could enter a state of FUD facing a new project/challenge ... and ... the unleashing of creativity and out-of-the-box thinking. Of course, I don't mean self-confidence in the broader sense: I speak of the FUD that hones the blade of reason :) At the same time, creative "exaltation," as you have novel insights, carries the risk so perfectly embodied in the Narcissus story. I was lucky that my professional work usually gave me complete freedom to "architect" whatever, because no one else understood my specialties (the PostScript language, color science, printing). These days, I would never claim to have been an "architect" in the profound way I believe Marc Clifton is!
«One day it will have to be officially admitted that what we have christened reality is an even greater illusion than the world of dreams.» Salvador Dali
-
I didn't so much claim the role as have it thrust upon me. In fact, I didn't want to go that direction, but I had been a senior developer for so long that headhunters were wondering about me. I had to move at least laterally, if not upward, whether I wanted to or not, in order to stay commercially viable.
Real programmers use butterflies
-
Thank your hubby. You're impossible to please.
honey the codewitch wrote:
I saved myself because I use "template polymorphism" instead of standard polymorphism and inheritance in my library
Case closed.
Bastard Programmer from Hell :suss: "If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
-
I'm certain you deserve the title ;)
-
I didn't like doing it, so I'm sure there are others more suited. I am good at systems. I love systems. But politics, not so much, and I found that being an architect involved more interfacing with people I didn't care to interface with than I really liked. I'm a coder. Give me code. I like design, but I like to do it on my own, or among a small team I lead, where I don't have to deal with politics. Otherwise I'd prefer to be strictly a coder.
-
Well, it has its drawbacks - like it's possible to get source code that won't compile into production if you do things this way, since template code is only checked when it's actually instantiated. Code coverage testing becomes much more ... fun. It can also lead to code bloat, which is one of the main things that will cause me to fall back on binary polymorphism. There is no single silver bullet. I wish there was.
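For anyone wondering what I mean by "template polymorphism": the drawing code is templated on the driver type, so the binding happens at compile time instead of through a vtable. A generic sketch of the technique - these names are made up for illustration, not my library's API:

```cpp
// Illustrative sketch of compile-time ("template") polymorphism.
// Any type with a write_pixel(x, y, on) member works as a driver;
// there's no common base class to change when requirements shift.
struct FakeDriver {
    int pixels_written = 0;
    void write_pixel(int, int, bool) { ++pixels_written; }
};

template <typename Driver>
void draw_hline(Driver& d, int x, int y, int w) {
    for (int i = 0; i < w; ++i) d.write_pixel(x + i, y, true);
}
```

The flip side is what I described above: draw_hline only gets type-checked against drivers it's actually instantiated with, and every distinct driver type stamps out its own copy of the code.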
-
Wow. I was not talking code.
I didn't see the hubby part. :laugh:
-
However good you are, your search for generality can never beat the 'clever tricks' of the 'creative people' designing those little devices. :-D
"In testa che avete, Signor di Ceprano?" -- Rigoletto
I wish it were a clever trick. That I could at least understand. Frankly, it was either laziness or expediency in the name of cost savings that made them work this way. There's no technological reason for it other than... less-than-stellar design.
-
honey the codewitch wrote:
expediency in the name of cost savings
Managers call that a 'clever trick'.
Hahaha that's fair. And why I'm glad I'm not in management. :laugh: