OO is not all that and a bag of chips
-
Disclaimer: Unpopular opinion.

A lot of coders spend a lot of lines of code dividing things into tiny steps, which they then make whole classes for, and abstract everything to the Nth degree, often even when the abstraction is not helpful. Back when I was a green coder, I used to write OO code somewhat like this. Then C++ changed me. I stopped relying on objects so much. This bled over into other languages. Now my code is about expedience.

For example, I created a little HTTP server that does the request/response cycle in a single method, with two support structs instead of a dozen classes. My code is smaller, faster, easy enough to understand if you aren't a beginner, and overall better for it.

It's getting to the point where I think OO is an ill-conceived paradigm - and not even because it's Broken As Designed (it's not), but because it gets so overused that the dev world may have been better off with something else.
Real programmers use butterflies
Disclaimer: Big Brother is watching you! There was a time when, at the least, you could lose your job for such claims; at worst, you could get killed by an angry mob of mostly rookie developers who want to show off.

I remember how impressed I was with multiple inheritance, assignment overloading, and copy constructors... One day I realized what I had always known as a kid: programming is data processing.

"In C++ as in Simula, a class is a user-defined type." "Every language that uses the word class, for type, is a descendant of Simula." - Bjarne Stroustrup

They should have called OOP "class-oriented development", because it appeals to class-obsessed chauvinists. Contrary to popular belief, objects are only data. You could have a pointer to an array of pointers to functions here and there, or a reference to a function, but that's data too. No matter what language you use, it all comes down to the same assembly language. Even before that, in the compilation process, programs are translated to a common, language-neutral data representation. So, for EVERY program in Java you could write a program in C that gets translated into the same assembly code the CPU will execute. But you could hardly write a Java program for ANY C program that will be translated into the same assembly code.

"The very first Java compiler was developed by Sun Microsystems and was written in C using some libraries from C++. Today, the Java compiler is written in Java, while the JRE is written in C." "The Sun JVM is written in C." Provided as is from Stack Overflow.

C implements Java, but Java cannot implement C. Back to topic, this is what I find most appealing:

"We don't have a mathematical model for OOP. We have Turing machines for imperative (procedural) programming, lambda calculus for functional programming, and even pi-calculus (and CSP by C.A.R. Hoare, and other variations) for event-based and distributed programming, but nothing for OOP. So the question of "what is a 'correct' OO program?" cannot even be defined (much less the answer to that question)."

It was given as an answer on Quora to the question 'Why did Dijkstra say that "Object-oriented programming is an exceptionally bad idea which could only have originated in California"?' Greetings.
-
The problem isn't OO; slavish fanatical adherence to anything at all screws everything up -- and it's certainly non-evolutionary. Doing things one way and one way only results in restrictions to growth and expansion. Given the above immutable fact, rigid adherence to OO practices is obviously wrong before even going into details, so I won't waste my time going into any (plus I don't have a week to spare).
I wanna be a eunuchs developer! Pass me a bread knife!
-
I find it interesting that you now use objects less than when you first started to code. I would've thought that it would work the other way around. But maybe that's for us dinosaurs who first learned structured programming and later had to think in terms of objects. If you learn objects first, I can see it progressing in the opposite direction. I'm curious as to what you meant by C++ changing your attitude towards objects. Maybe you started to use them less because C++ has too much boilerplate!

Sure, a standalone piece of code will be smaller, faster, and easier to understand if it isn't broken up into many little objects. But you called it a "little HTTP server". What if it had to be big? Or support other protocols? Or be integrated with a large system? The larger the system, the more important it is to achieve reuse and abstraction, which means separating concerns and using polymorphism, inheritance, and encapsulation. Without this, developers clone and tweak code that can't be reused because it's admixed with other concerns. It also becomes harder and harder to add new capabilities, because they have to interwork with components that exhibit superfluous diversity. A maze of twisty little passages, all different.

That said, OO can get overused and won't solve everything. It would be great if it could be coupled with aspect-oriented programming, but I haven't seen a good way to do that, and aspects may simply be intractable when it comes to cleanly separating concerns.
Robust Services Core | Software Techniques for Lemmings | Articles
> I'm curious as to what you meant by C++ changing your attitude towards objects.

Maybe he meant that C++ is a multiparadigm programming language, as opposed to Java, whose selling point was that it's the true and only OOP?

"I actually never said that [C++ is an object-oriented language]. It's always quoted, but I never did. I said that C++ supports object-oriented programming and other techniques." - Bjarne Stroustrup, Artificial Intelligence Podcast, 42:20

> What if it had to be big?

Do it in OOP and you will make it big.

"Once you reach a particular size, anything beyond that is no longer a reflection of functionality." - Kevlin Henney, GOTO 2016 • Small Is Beautiful, 55:40

The Facebook iOS app has over 18000 classes. How do you compare it to Quake 3, which could render a 3D world at 30 FPS on a Pentium 3? My guess is that the Quake team could have developed that FB app with five or no classes in 10x less time, with the app being 10x less buggy and running at 10x the speed of the current iOS FB app. At the same time, the iOS FB team could hardly construct a PAC-MAN type game. (No disrespect to Pac-Man coderz.) OOP supporters always assume that the only pure, moral way to write code is OOP, and that elite developers do only OOP.
-
But also, with the failure rate of software I'm glad we don't build bridges and skyscrapers. :laugh:
Real programmers use butterflies
honey the codewitch wrote:
But also, with the failure rate of software I'm glad we don't build bridges and skyscrapers.
The % of "so-called" programmers is much, much, much bigger than the % of architects.
M.D.V. ;) If something has a solution... Why do we have to worry about?. If it has no solution... For what reason do we have to worry about? Help me to understand what I'm saying, and I'll explain it better to you Rating helpful answers is nice, but saying thanks can be even nicer.
-
I think I probably muddied my point with my lament at the end about coding being worse off for OO. It was intended as a kind of wry way of saying it's been overused so much that maybe it has been harmful overall. I use OO myself where I find it's appropriate. My post shouldn't be read as a universal condemnation of it. It's more about how it's often used.
Real programmers use butterflies
TLDR; Most excesses are bad.
M.D.V. ;) If something has a solution... Why do we have to worry about?. If it has no solution... For what reason do we have to worry about? Help me to understand what I'm saying, and I'll explain it better to you Rating helpful answers is nice, but saying thanks can be even nicer.
-
I know what aspects are. To be clear: where they tie in here is that you create "services" for your classes, implemented as templates that take your own class as a template argument. Those allow you to encapsulate orthogonal class services to a degree. It doesn't always allow for complex cross-cutting scenarios, and arbitrarily wrapping methods with pre- and post-code takes some vtbl foolery (Microsoft takes this approach with certain ATL internals), but you get your aspects this way. It's kind of primitive, and just as "kludgy but powerful" as templates are. Between the above and using type traits and such, you can get it to do a lot of what you would do with aspect-oriented builtins in a higher-level language.
Real programmers use butterflies
I think I see what you're getting at. Is there an example that you can point to? If I'm guessing correctly, it would probably fit in well with what Sutter suggests in Virtuality[^]. If every virtual function is private and invoked by a non-virtual function that is public, that provides a place to add pre- and post-code. But it would certainly have some limitations.
Robust Services Core | Software Techniques for Lemmings | Articles
-
sickfile wrote:
Do it in OOP and you will make it big.
You might have a lot of boilerplate, which is a PITA but not "big". It would be big if, say, functional programming was a much better fit for the problem.
Quote from Kevlin Henney:
"Once you reach a particular size, anything beyond that is no longer a reflection of functionality."
It's often true that inside a big system, there's a small system struggling to get out. But as a blanket statement, this quote is just a platitude.
sickfile wrote:
The Facebook iOS app has over 18000 classes. How do you compare it to Quake 3 that can render 30FPS of a 3D world on Pentium 3?
18000 classes is a joke. But you can't compare it to the portion of a game that renders graphics, which is highly algorithmic and doesn't require much OO, although your point might be that this wouldn't stop some people from trying to do it that way.
Robust Services Core | Software Techniques for Lemmings | Articles
-
Software failure isn't caused by the tools or paradigms used to develop it - it's caused by the programmers not doing it right.
".45 ACP - because shooting twice is just silly" - JSOP, 2010
-----
You can never have too much ammo - unless you're swimming, or on fire. - JSOP, 2010
-----
When you pry the gun from my cold dead hands, be careful - the barrel will be very hot. - JSOP, 2013 -
#realJSOP wrote:
it's caused by the programmers not doing it right.
exactly.
-
Greg Utas wrote:
although your point might be that this wouldn't stop some people from trying to do it that way.
as in IoT... only because you can, doesn't mean you should :rolleyes: :-D
M.D.V. ;) If something has a solution... Why do we have to worry about?. If it has no solution... For what reason do we have to worry about? Help me to understand what I'm saying, and I'll explain it better to you Rating helpful answers is nice, but saying thanks can be even nicer.
-
I have a Console application that disagrees :D
using System;
namespace ConsoleApp1
{
class Program
{
static void Main()
{
IMessageGetter messageGetter = new BoohCodewitchMessageGetter();
IMessagePrinter messagePrinter = new ConsoleMessagePrinter();
IInputAwaiter inputAwaiter = new ConsoleInputAwaiter();
string message = messageGetter.GetMessage();
messagePrinter.PrintMessage(message);
inputAwaiter.AwaitInput();
}
}

public interface IMessageGetter { string GetMessage(); }
public interface IMessagePrinter { void PrintMessage(string message); }
public interface IInputAwaiter { void AwaitInput(); }

public abstract class BaseMessageGetter : IMessageGetter { public abstract string GetMessage(); }
public abstract class BaseMessagePrinter : IMessagePrinter { public abstract void PrintMessage(string message); }
public abstract class BaseInputAwaiter : IInputAwaiter { public abstract void AwaitInput(); }

public class BoohCodewitchMessageGetter : BaseMessageGetter { public override string GetMessage() => "Booh codewitch, your opinion sucks!"; }
public class ConsoleMessagePrinter : BaseMessagePrinter { public override void PrintMessage(string message) => Console.WriteLine(message); }
public class ConsoleInputAwaiter : BaseInputAwaiter { public override void AwaitInput() => Console.ReadKey(); }
}
Best, Sander sanderrossel.com Migrating Applications to the Cloud with Azure arrgh.js - Bringing LINQ to JavaScript Object-Oriented Programming in C# Succinctly
-
*headdesk*
Real programmers use butterflies
-
sickfile wrote:
We don’t have a mathematical model for OOP
That's an extremely good point. To be fair, as I've said elsewhere in the thread, I use OO in places - for example, if I expose an API to whatever I'm writing, that will often be OO. And I tend to use OO here and there for other reasons when I'm stuck in a hard OO environment like Java or C#. I limit its use, though, by asking:
1. Does it help explain the code?
2. Does it work with the rest of the code rather than against it?
3. Does it encapsulate an abstraction such that it makes it simpler to employ?
There are so many times when the answers to those questions are no, and I see people using objects. See @SanderRossel's console app upthread - he was ribbing me, but it's a good example of class misuse.
Real programmers use butterflies
-
Nelek wrote:
as in IoT
You mean my wifi enabled AI toaster is overkill? :-D
Real programmers use butterflies
-
I didn't intend to imply otherwise, despite the overarching topic. I switched gears.
Real programmers use butterflies
-
I'd have to dig up some of my old code off my OneDrive, if it's there; I wasn't using GitHub until more recently. People usually don't describe it the way I do. It's a technique I picked up while doing research into making a rather ambitious business integration system with COM+-like features. Someone demonstrated cross-cutting functionality using the Curiously recurring template pattern - Wikipedia[^]. It gave me one of those aha moments, and since then, whenever I see a CRTP like the above, I half expect it. Dr. Dobb's has an article about doing cross-cutting, but they don't use generic programming to do it: Aspect-Oriented Programming & C++ | Dr Dobb's[^]. But if you poke at it, you can see there are opportunities for factoring it into a template.
Real programmers use butterflies
-
Fair enough. It just seems like a popular thing to do in this case. I see it all over with .NET projects.
Real programmers use butterflies
-
Also, CS is only somewhat engineering. It's half voodoo. :laugh:
Real programmers use butterflies
-
It will be when someone hacks into it to burn your house down. :laugh:
Robust Services Core | Software Techniques for Lemmings | Articles
-
"We don't have a mathematical model for OOP" sounds like a lament from a formal methods fanboi, in which case the objection can be ignored.
Robust Services Core | Software Techniques for Lemmings | Articles