*cries in C++*
-
I love this language except when it's used cryptically. You can produce more incomprehensible code with C++ than I think you can in any other major language. I'm poring over C code right now - C really isn't that much better, but fortunately you can do less with it. The code is evil. It's absolutely terrible to read, almost as if they were *trying* to hide intent. Porting it to C++ is my fresh hell. I love this language, but would it kill people to write readable code, or at least comment it with something *helpful*? Anyway, I guess what I'm saying is C++ is both my favorite and least favorite language. It's weird like that.
Real programmers use butterflies
Quote:
I'm poring over C code right now - C really isn't that much better, but fortunately you can do less with it. The code is evil. It's absolutely terrible to read, almost as if they were *trying* to hide intent.
How do you hide intent in C? No overloaded operators, no overloaded functions, no implicit calls to constructors that need explicit calls to destructors, no symbols with identical names in different namespaces ... The number of "magic happens implicitly behind the scenes" things in C is ridiculously small.
-
Quote:
I'm poring over C code right now - C really isn't that much better, but fortunately you can do less with it. The code is evil. It's absolutely terrible to read, almost as if they were *trying* to hide intent.
How do you hide intent in C? No overloaded operators, no overloaded functions, no implicit calls to constructors that need explicit calls to destructors, no symbols with identical names in different namespaces ... The number of "magic happens implicitly behind the scenes" things in C is ridiculously small.
By making your code do something that is non-obvious.
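A tiny sketch of what "non-obvious" looks like in practice (a hypothetical example, not from the code being discussed): both functions below count set bits, but the first buries the intent behind a throwaway name and a bit trick with no explanation.

```cpp
#include <cstdint>

// Non-obvious: Kernighan's bit trick with a throwaway name reads like line noise.
int f(uint32_t x) {
    int n = 0;
    for (; x; ++n) x &= x - 1;
    return n;
}

// Same behavior, with the intent spelled out.
int count_set_bits(uint32_t value) {
    int count = 0;
    while (value != 0) {
        value &= value - 1; // clear the lowest set bit
        ++count;
    }
    return count;
}
```

Both compile to essentially the same machine code; only one of them can be read six months later without a whiteboard.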
Real programmers use butterflies
-
C gets you rather close to the machine, but it keeps you there. Meaning there isn't really a way in C NOT to hide intent as you'd be rather busy spelling out mechanics of the "how" explicitly, burying the intent. I very much agree with you on C++ making it way easier to spell out the intent, letting the library do the how, or at least abstracting it away.
The one-line C contest is holding on line 1. It would like a word.
Real programmers use butterflies
-
TECO was the same way. It was a string-processing language, and the first version of Emacs was written in it. One game TECO coders would play was to type a word (even their own name) as a command string and challenge their counterparts to predict the result.
I remember TECO - had a port of it as my first desktop computer editor! Took some learning, but boy was it powerful when all you had was a line editor...
-
I love this language except when it's used cryptically. You can produce more incomprehensible code with C++ than I think you can in any other major language. I'm poring over C code right now - C really isn't that much better, but fortunately you can do less with it. The code is evil. It's absolutely terrible to read, almost as if they were *trying* to hide intent. Porting it to C++ is my fresh hell. I love this language, but would it kill people to write readable code, or at least comment it with something *helpful*? Anyway, I guess what I'm saying is C++ is both my favorite and least favorite language. It's weird like that.
Real programmers use butterflies
-
honey the codewitch wrote:
You can produce more incomprehensible code with C++ than I think you can in any other major language
Hahahahaha ... not even close. Work this out:
⎕←(~A∊A∘.×A)/A←1↓⍳N
or this:
life ← {⊃1 ⍵ ∨.∧ 3 4 = +/ +⌿ ¯1 0 1 ∘.⊖ ¯1 0 1 ⌽¨ ⊂⍵}
C++ can't even come close to APL for code density or incomprehensibility! :laugh: The first one is the Sieve of Eratosthenes, the second is the Game of Life.
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony "Common sense is so rare these days, it should be classified as a super power" - Random T-shirt AntiTwitter: @DalekDave is now a follower!
APL was my first language: in high school, then in university co-op work. Yes, it's dense, and it uses symbols you don't see on a regular keyboard. But it really changes how you think, for the good. Functional programming (!), expressions on data collections rather than rat-holing on iterators. It was great for analytics of the first (StatsCan) time-series database. After APL, I worked in C, ZOPL, PL/I, Algol, POP2, VB, perl, C++ and more. Every one added new bits for understanding the next one down the line; some of them on what to avoid (I'm looking at *you*, C++20). But APL was the strongest and cleanest. Now JPMorgan uses it (well, K), because it is superfast for solving complex problems, and surprisingly straightforward. Everyone has fun with the primes and Game of Life one-liners :0) They don't get to see the full applications. Sigh. My 2¢
-
I love this language except when it's used cryptically. You can produce more incomprehensible code with C++ than I think you can in any other major language. I'm poring over C code right now - C really isn't that much better, but fortunately you can do less with it. The code is evil. It's absolutely terrible to read, almost as if they were *trying* to hide intent. Porting it to C++ is my fresh hell. I love this language, but would it kill people to write readable code, or at least comment it with something *helpful*? Anyway, I guess what I'm saying is C++ is both my favorite and least favorite language. It's weird like that.
Real programmers use butterflies
Well, I think the trouble with all midway languages (which to me means any compiled language between FORTRAN's readability and Assembler) is that readability will always come second to performance... although that should mean a proportional amount of comments gets added to the code (at least a full page for the dark arts). Some of the coolest code I've seen used pointer arithmetic like hell to bypass C++ access permissions (this was a video-game engine), and the only comment it had was:
/* Do not change this code or ponies will cry */
I had been tracing the source code through 7 different files when I was rewarded with a hearty laugh there. I don't know how readable it was to anyone else, but since then I've known that game engine, compiler, and O.S. source code is never going to be readily accessible to just anyone (despite our best comments). That is the nature of language, human, computer or otherwise.
-
Well, I think the trouble with all midway languages (which to me means any compiled language between FORTRAN's readability and Assembler) is that readability will always come second to performance... although that should mean a proportional amount of comments gets added to the code (at least a full page for the dark arts). Some of the coolest code I've seen used pointer arithmetic like hell to bypass C++ access permissions (this was a video-game engine), and the only comment it had was:
/* Do not change this code or ponies will cry */
I had been tracing the source code through 7 different files when I was rewarded with a hearty laugh there. I don't know how readable it was to anyone else, but since then I've known that game engine, compiler, and O.S. source code is never going to be readily accessible to just anyone (despite our best comments). That is the nature of language, human, computer or otherwise.
Breaking encapsulation by offsetting from a class's base address seems like bad form, even for a game engine. That's what the "friend" keyword is for. :) Of course, I have some nasty, nasty code in my SPI bus code because it interacts with the ESP32's SPI hardware registers directly for performance. It hurts.
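For what it's worth, here is a minimal sketch of the `friend` alternative (all names hypothetical): the engine's scripting layer gets sanctioned access to the class internals, with no pointer arithmetic and no assumptions about object layout.

```cpp
// A script engine that may poke an entity's private state.
class Entity {
    int health_ = 100;
    friend class ScriptEngine; // explicit, greppable grant of access
public:
    int health() const { return health_; }
};

class ScriptEngine {
public:
    // Sanctioned back door: no offsets from the base address,
    // no layout assumptions, survives any refactor of Entity.
    static void set_health(Entity& e, int value) { e.health_ = value; }
};
```

The grant is visible in the class definition itself, so anyone reading `Entity` knows exactly who else can touch its internals.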
Real programmers use butterflies
-
Breaking encapsulation by offsetting from a class's base address seems like bad form, even for a game engine. That's what the "friend" keyword is for. :) Of course, I have some nasty, nasty code in my SPI bus code because it interacts with the ESP32's SPI hardware registers directly for performance. It hurts.
Real programmers use butterflies
Lol. For the time it was written, it was most certainly a hack. That portion was still C++98, and the pointer redirection was for a scripting language internal to the engine (to directly call any method from any class in the engine). SPI bus code... you reminded me of assembler code I found in our main Nintendo DSi engine (we called it coldbits) that wrote directly to the buffer of the image processor. It was not child's play, but I don't think the use of the asm keyword in C++ is hackish (more of a "Here be dragons" kind of warning).
-
By making your code do something that is non-obvious.
Real programmers use butterflies
But to be honest, that can be done in any language. I'm reading through "Learning Python" and just hit the description of string formatting. I don't know what that guy was smoking when he came up with this approach, but I want some.
Charlie Gilley “They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.” BF, 1759 Has never been more appropriate.
-
Breaking encapsulation by offsetting from a class's base address seems like bad form, even for a game engine. That's what the "friend" keyword is for. :) Of course, I have some nasty, nasty code in my SPI bus code because it interacts with the ESP32's SPI hardware registers directly for performance. It hurts.
Real programmers use butterflies
After having lived with Microsoft's version of C++, and with the C++ source code of applications written by C developers, I have come to the point that I like encapsulation, but inheritance is a multi-headed hydra and usually not worth the effort. My experience has been that if you go past one level of inheritance, you are elephanting doomed. Reading about objects and whatnot sounds nice, but once you get your nice inheritance hierarchy set up, in maintenance you realize that you have an inverted pyramid. Touch one base class, and it all falls down. Now, I would agree with you: if you are sharing pointers amongst objects, you have design issues.
Charlie Gilley “They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.” BF, 1759 Has never been more appropriate.
-
After having lived with Microsoft's version of C++, and with the C++ source code of applications written by C developers, I have come to the point that I like encapsulation, but inheritance is a multi-headed hydra and usually not worth the effort. My experience has been that if you go past one level of inheritance, you are elephanting doomed. Reading about objects and whatnot sounds nice, but once you get your nice inheritance hierarchy set up, in maintenance you realize that you have an inverted pyramid. Touch one base class, and it all falls down. Now, I would agree with you: if you are sharing pointers amongst objects, you have design issues.
Charlie Gilley “They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.” BF, 1759 Has never been more appropriate.
These days I don't use inheritance much except for compile-time computation tricks; I usually use templates rather than strict base classes. In GFX I expose a "caps" template structure that indicates the capabilities of an object. GFX calls certain methods on that object based on the values in that structure, so, for example, it can determine whether the object supports reading, or certain optimized operations. Normally you'd inherit from a base class to do this, but you can do similar with templates. Code size can get to be an issue depending on what you're doing, but I like the flexibility.
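A greatly simplified sketch of the "caps" idea (hypothetical names, not the actual GFX code): each driver advertises its capabilities as compile-time constants, and generic code branches on them with `if constexpr` instead of virtual calls. A discarded branch is never instantiated, so a driver only has to implement the methods its caps claim.

```cpp
// A driver that advertises an optimized fill path.
struct fast_fill_driver {
    struct caps {
        static constexpr bool read = false;
        static constexpr bool fast_fill = true;
    };
    int fast_calls = 0;
    void fill_fast() { ++fast_calls; }
};

// A driver without the optimized path; note it never defines fill_fast(),
// because the discarded `if constexpr` branch is not instantiated for it.
struct basic_driver {
    struct caps {
        static constexpr bool read = true;
        static constexpr bool fast_fill = false;
    };
    int slow_calls = 0;
    void fill_slow() { ++slow_calls; }
};

// Generic code: the branch is chosen at compile time, per driver type.
template <typename Driver>
void fill(Driver& d) {
    if constexpr (Driver::caps::fast_fill) {
        d.fill_fast();  // optimized path
    } else {
        d.fill_slow();  // generic fallback
    }
}
```

Compared with a virtual base class, there's no vtable and no runtime dispatch; the trade-off is that each driver type instantiates its own copy of the generic code, which is where the code-size concern comes from.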
Real programmers use butterflies
-
Lol. For the time it was written, it was most certainly a hack. That portion was still C++98, and the pointer redirection was for a scripting language internal to the engine (to directly call any method from any class in the engine). SPI bus code... you reminded me of assembler code I found in our main Nintendo DSi engine (we called it coldbits) that wrote directly to the buffer of the image processor. It was not child's play, but I don't think the use of the asm keyword in C++ is hackish (more of a "Here be dragons" kind of warning).
The reason I don't like the asm keyword is that I like my C++ code to be portable. I target a lot of IoT, and those devices come in a variety of architectures, even among the same lines of chips. I've studied the output of GCC in many cases, to where I can structure my C++ code to generate the asm I want while maintaining its higher-level structure. Of course, that doesn't work if you need to set specific registers and such. Still, I use a lot of compile-time tricks to achieve my goals in the real world, and the result tends to be a best-of-both-worlds scenario: you get efficient, highly structured code.
Real programmers use butterflies
-
These days I don't use inheritance much except for compile-time computation tricks; I usually use templates rather than strict base classes. In GFX I expose a "caps" template structure that indicates the capabilities of an object. GFX calls certain methods on that object based on the values in that structure, so, for example, it can determine whether the object supports reading, or certain optimized operations. Normally you'd inherit from a base class to do this, but you can do similar with templates. Code size can get to be an issue depending on what you're doing, but I like the flexibility.
Real programmers use butterflies
Inheritance for me mainly works best in a very contained environment. I suppose if I were responsible for a core set of code that would apply to multiple applications (like a GUI control set), it might make sense. My career has been spent developing one application after another, and rarely do they inherit from each other. Maybe basic concepts, but as soon as some other team member does not understand something, they code up their own solution and off we go. Inheritance broken. Templates - well there be magic, but in fact, yet again, the ivory tower folks seem to come up with a pristine solution, and the folks that are shoveling $^&&^& in Dixie code something up they understand to get the job done.
Charlie Gilley “They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.” BF, 1759 Has never been more appropriate.
-
I love this language except when it's used cryptically. You can produce more incomprehensible code with C++ than I think you can in any other major language. I'm poring over C code right now - C really isn't that much better, but fortunately you can do less with it. The code is evil. It's absolutely terrible to read, almost as if they were *trying* to hide intent. Porting it to C++ is my fresh hell. I love this language, but would it kill people to write readable code, or at least comment it with something *helpful*? Anyway, I guess what I'm saying is C++ is both my favorite and least favorite language. It's weird like that.
Real programmers use butterflies
honey the codewitch wrote:
but would it kill people to write readable code, or at least comment it with something *helpful*?
In my experience, and based on the cries of anguish, gnashing of teeth, and pulling out of hair when I so much as suggest that comments have a place in code, I am guessing that the answer is yes, it would kill them.
-
honey the codewitch wrote:
but would it kill people to write readable code, or at least comment it with something *helpful*?
In my experience, and based on the cries of anguish, gnashing of teeth, and pulling out of hair when I so much as suggest that comments have a place in code, I am guessing that the answer is yes, it would kill them.
Comments *do* have a place in code - as a last resort. I mean, it's one thing to comment the description of a function and its arguments as a header, but the code inside it should have no more comments than necessary; each comment ultimately represents a failure to express intent clearly in code. Sometimes comments are necessary, because code can be evil. But good code expresses intent clearly when it can, obviating the need for a comment. At least that's my take on it. Code should be self-evident wherever possible.
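A small illustration of the point (a made-up example, not from any codebase in this thread): the first version needs a comment to explain itself; the second expresses the same intent through names alone.

```cpp
// Needs a comment to be understood at all:
bool chk(int a) { return a >= 18; } // true if old enough to vote

// Self-evident; the comment above becomes unnecessary:
constexpr int voting_age = 18;
bool is_eligible_to_vote(int age) { return age >= voting_age; }
```

Both functions are identical to the compiler; the difference is purely in how much the reader has to reconstruct.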
Real programmers use butterflies
-
But to be honest, that can be done in any language. I'm reading through "Learning Python" and just hit the description of string formatting. I don't know what that guy was smoking when he came up with this approach, but I want some.
Charlie Gilley “They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.” BF, 1759 Has never been more appropriate.
Yeah, it can be done in any language; it just seems like some languages make it easier to write incomprehensible code than others.
Real programmers use butterflies
-
APL was my first language: in high school, then in university co-op work. Yes, it's dense, and it uses symbols you don't see on a regular keyboard. But it really changes how you think, for the good. Functional programming (!), expressions on data collections rather than rat-holing on iterators. It was great for analytics of the first (StatsCan) time-series database. After APL, I worked in C, ZOPL, PL/I, Algol, POP2, VB, perl, C++ and more. Every one added new bits for understanding the next one down the line; some of them on what to avoid (I'm looking at *you*, C++20). But APL was the strongest and cleanest. Now JPMorgan uses it (well, K), because it is superfast for solving complex problems, and surprisingly straightforward. Everyone has fun with the primes and Game of Life one-liners :0) They don't get to see the full applications. Sigh. My 2¢
mischasan wrote:
because it is superfast for solving complex problems, and surprisingly straightforward.
Right up until someone asks for the studies that demonstrate those claims using something besides benchmarks (which are often coded well in one language and then so poorly in the others that one suspects they were written that way deliberately). But to be fair, I am certain I have seen that claim made about every language, based, when based on anything at all, on the same sort of lopsided benchmarks.
-
C++ is my favorite language because it does not force you to do anything. That includes making your code readable, which is the original author's fault, not the language's. Languages that force you to make your code readable will inevitably lose some (potentially very useful) features in order to make that happen, like #define for example. I admit I am guilty of intentionally making code less readable, but only because I am forced to run it through a painfully awful "security" code scanner. The program is a web API in C#, and the only way we can take any kind of data from the db and return it to the caller without the scanner whining is to store the data in a dictionary (dynamic) and then retrieve it back again, so that's been wrapped up in a pair of methods: return obfuscator.get(obfuscator.insert(db.runProc("ProcName", args, or, whatever)))
-
Inheritance for me mainly works best in a very contained environment. I suppose if I were responsible for a core set of code that would apply to multiple applications (like a GUI control set), it might make sense. My career has been spent developing one application after another, and rarely do they inherit from each other. Maybe basic concepts, but as soon as some other team member does not understand something, they code up their own solution and off we go. Inheritance broken. Templates - well there be magic, but in fact, yet again, the ivory tower folks seem to come up with a pristine solution, and the folks that are shoveling $^&&^& in Dixie code something up they understand to get the job done.
Charlie Gilley “They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.” BF, 1759 Has never been more appropriate.
I just love templates, personally. I'm not as big a fan of the STL, but I don't have much occasion to use it, since I often target the Arduino platform with my code, and its STL implementation is only partial on some of its targets. Part of the joy of coding C++ for me is seeing what I can schlep from runtime to compile time in order to increase performance and maintain flexibility. I have a pixel template class that lets you declare individual color channels and the bit depth of each, and compose a pixel with as many channels as you want, up to the machine's word size. It will then let you modify the individual channels, so you can set the red channel of an RGB pixel, or the U channel of a Y'UV pixel, and it will recompute the overall value. If you use constants, it will compute all of it at compile time, including getting the compiler to shift arbitrary bits in an arbitrary direction.
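A greatly simplified sketch of that pixel idea (hypothetical names; the real GFX class is far more general): each channel declares its bit depth and position, the masks and shifts fall out as compile-time constants, and with constant arguments the whole pack/unpack folds away at compile time.

```cpp
#include <cstdint>

// A channel occupies `Bits` bits starting at bit `Shift` of the pixel word.
template <unsigned Bits, unsigned Shift>
struct channel {
    static constexpr unsigned shift = Shift;
    static constexpr uint32_t mask  = ((1u << Bits) - 1u) << Shift;
};

// RGB565: red 5 bits, green 6 bits, blue 5 bits.
using r5 = channel<5, 11>;
using g6 = channel<6, 5>;
using b5 = channel<5, 0>;

// Clear the channel's bits, then merge the new value; with constant
// arguments this is entirely evaluated at compile time.
template <typename Ch>
constexpr uint32_t set_channel(uint32_t pixel, uint32_t value) {
    return (pixel & ~Ch::mask) | ((value << Ch::shift) & Ch::mask);
}

template <typename Ch>
constexpr uint32_t get_channel(uint32_t pixel) {
    return (pixel & Ch::mask) >> Ch::shift;
}
```

Because `set_channel` and `get_channel` are `constexpr`, setting the red channel of a constant pixel produces a literal in the binary; only pixels built from runtime values cost any instructions at all.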
Real programmers use butterflies