Why isn't C# more popular?
-
I've experimented with a lot of languages over the years and delivered production code in at least a handful. Of all of them, I find C# the most genial. Its syntax is close enough to C/C++ not to alienate those programmers, it has plenty of high-level features to keep users of languages such as Java happy, and these days it has good performance and is open and cross-platform. The only thing I hate about it is the terrible 'destructor' pattern, which you can ignore most of the time. Despite all of this, I rarely, if ever, read a headline saying C# is gaining in popularity. There's no point getting into syntax specifics, because that would be a never-ending discussion, but why does it fail to hit the spot with so many developers and companies?
Thank you to anyone taking the time to read my posts.
I am definitively a C# guy, but I generally love programming, so I've also learned/used JavaScript, VB.Net, LISP, Python, and miscellaneous other obscure scripting languages, and I am currently taking a C++ class so I can effectively play with some specific SDKs. Each of these languages has taught me different lessons. There is no correct language; some are just plain easier to deploy in different places. Microsoft's (open source!) .NET 5 gives me some hope that I'll get to use C# more frequently in the future. I certainly don't understand the anti-Microsoft groups. In recent years, MS has been a beacon of what I've always fundamentally believed needed to happen, i.e., the dismantling of proprietary platforms and file types so that all things have the potential to interoperate. That is the only path to thriving in the next era, and the companies (regardless of their size) that continue to live in a "black box" state will someday die if they don't focus on being "open world" stars instead of "closed world" wardens.
-
It's simple for me: how much can I trust MSFT not to break old projects? We have 20-, 30-, and even 50-year-old code still running in production! Have you loaded a .NET 1 project and recompiled it lately? My formative experience with MSFT was when they abandoned 16-bit C++ support. We had to buy a Borland C++ compiler to compile our MSFT 32-bit C++ (new features) into the 16-bit code that 90% of our "clients" required for their DLLs. Don't get me started on the nightmare we had doing a mobile project in C#. We are starting to embrace C# now that Rider exists. It's not the language; it's the ecosystem the language ends up requiring, both for compiling and for running!
-
C# is my #1 language, BUT IMO it's because of the refusal to acknowledge that the JIT is a failed idea, so full of baggage and half-baked ideas that people who still argue for the JIT just end up making excuses for why it lags behind at this point. The language and runtime need to be as one, which is another issue with .NET: it takes more time to port, giving you slower and more bloated code.
-
Well, when it started, and for a long time after, there was an open-source-hostile environment at Microsoft; Ballmer did not do it any favors, for sure. Java was and is more widely supported because it embraced the open source community, and consequently there are hundreds of ways to do any one thing. Now Java has inertia on its side, and C# is still in its infancy (comparatively) as far as open source is concerned. Basically, Microsoft bit itself in the butt.
-
Rick York wrote:
deleting what you allocate is far too much to ask of programmers
Actually, it is. "Use after free" is one of the biggest security risks in C/C++. Also, not freeing what's no longer in use eventually leads to memory exhaustion. These two situations have been known issues since at least 1958, when Lisp was first developed. This is also why all high-level business languages (as opposed to languages for embedded or operating system development) contain at least some memory garbage collection. Almost all early languages (COBOL, BASIC, FORTRAN, APL, Algol, etc.) have some concept of garbage collection for some data types. What changed with Java was that all data types are garbage collected unless the programmer explicitly tells the compiler not to do so.
-
I've experimented with a lot of languages over the years and delivered production code in at least a handful. Of all, I find C# to be the most genial. It has syntax close enough to C/C++ not to alienate those programmers, has plenty of high-level stuff to keep users of languages such as Java happy, and these days has good performance and is open and cross-platform. The only thing I hate about it is the terrible 'destructor' pattern, which you can ignore most of the time. Despite all of this, I rarely if ever read a headline that says C# is gaining in popularity. There's no point in getting into too many syntax specifics because that would be a never-ending discussion but why does it fail to hit the spot with so many developers and companies?
Thank you to anyone taking the time to read my posts.
I agree. C++ is a nightmare (and I've been programming in C++ years longer than C#). C++'s error messages are idiotic; sometimes they're only useful for identifying the line an error occurred on. For years, I assumed it had to be this way: spend hours poring over "correct" code, wondering why the compiler is complaining. Then my first substantial C# program compiled and linked almost the first time! No crazy "errors" about correct code! The error messages made sense!

With C# you generally don't need third-party packages. A typical C++ project at work requires separate third-party packages (with all their quirks, complexity, and bad documentation) for:
1. The GUI
2. Processing XML
3. Encryption
4. OpenSSL

C# also catches many memory errors. On one project I did the C# end and a team in Germany did the C++ end. I mentioned to one of them that C# catches memory errors, and got the arrogant reply, "There are no memory errors in my code!" Fast forward a few weeks: their end wasn't working. Memory errors. I took the opportunity to replace what they were doing, their whole team (four of them) was taken off the project and reassigned, and I got a stock grant for saving the company money. I'd choose C# over C++ for almost anything.
-
Nonsense. If that is too much to ask of a programmer then they need to find another line of work.
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
A big part of the problem is that it's not always clear who bears the responsibility for releasing an allocation. If you think otherwise, perhaps it's you who needs to consider an alternate career. Or prepare yourself for a big shock if you're just getting started and have assumed it's that simple.
-
-
If it isn't clear then you aren't doing it right.
Exactly the kind of rebuttal I would expect from someone who doesn't have a lot of experience. Your pronouncement would be more defensible if you had written "somebody" didn't do it right, but it's not necessarily the someone who is writing code today, and the question of what exactly "it" is has several potential answers.

It could be, for example, that a library author meant to conform to a specific predefined protocol and failed. Or it could be they were implementing something new and the documentation they provided is incomplete or incorrect. In especially old code, perhaps they *were* correctly following a known protocol, but the protocol itself ended up redefined. Or one of my favorites: a library has multiple functions that accept an allocation as a parameter. Some consume the allocation and others just reference it, and there's a convention to help you, the library user, recognize which are which. But there's also an old function that doesn't follow the convention; its behavior is grandfathered in because it's used in existing systems, and the footnote mentioning this legacy deviation is cropped off the bottom of the photocopied documentation you were given. I've run into all of those scenarios in large-scale production systems I was trying to interface with.

It's easy to assert that the only reason this is an issue is that somewhere, sometime, somebody did something wrong. You may be 100% correct about that. But you're making the very point you're arguing against. Things like this absolutely happen, and in real life it is one of the most common sources of program misbehavior. We know from decades of experience that this *will* go wrong and that it *will* result in system instability and/or security exposures.

So we can cross our fingers and hope, as systems continue to increase in complexity, that coders as a population will become perfect at it; or we can automate this tedious, error-prone task for essentially perfect behavior today and let developers spend their time and energy on the real meat of their projects.
-
I can't argue with your real-world experience, but I once had to compile some 90s C code (i.e., not that old), and it took me ages to get it to compile with either the MS compiler or GCC. I spent years writing C++ in video games and am well aware of its positives and negatives, and of why it is still essential for so many projects. All I am saying is that, in my opinion, C# is a better language than surveys would suggest, and I agree with the other comments here arguing that there is bias and prejudice in the system.
Thank you to anyone taking the time to read my posts.
-
Yes, deleting allocated objects is one of the places where lots of bugs are found, and many of them are very difficult to track down. Slow memory leakage is something that eats up lots of support time and turns off many customers.
Yes, I know. That's why I make sure I never have those bugs. My software has to run non-stop for months and months, so leaks and errors of any kind always show up.
-
Here's my $0.02. .NET Core is just not that popular for cross-platform work, and it isn't supported on that many platforms; basically Microsoft missed the opportunity that Java didn't, and they never had their "Android" booster as Java did. Performance-wise, C# doesn't compete with C++. It's not as easy to learn as Python. It's not a popular language for game development: Unity is at best a platform for learning game programming, and I haven't seen any AAA game made with it. C# is used mainly for insanely boring projects, like medium-complexity Windows-specific enterprise apps; it isn't really a go-to language/platform for multimedia, machine learning, gamedev, or math-heavy work. Really, if you look at what C# devs are being hired for, you can get depressed.
-
Because it's an MS product.
The first correct (IMO) answer
Java, Basic, who cares - it's all a bunch of tree-hugging hippy cr*p
-
I was partway through a professional C++ course, just beginning Windows programming, when C# first came out and the course switched to C#, which annoyingly meant an extra year of study. I liked it at first, though the class library (v1) was crap in places, and the push to make XML universal flopped (thankfully), but at least the language was nice and ideal for Windows apps.

Then Windows apps went out of fashion, and C# kept pumping out new versions (in that irritating MS way). The class libraries certainly improved, but they made too many fundamental changes to the language for my liking, and I found myself wondering whether I was coding anything the 'right way'.

Then I discovered Java and never looked back. It is not great for the web, admittedly; I use JS and PHP for that (I don't like ASP.NET). Java suits my intuition, and with it the code just flows; I don't have to worry about all the quirks of C# and its often cumbersome syntax. C# meanwhile has focused on ASP.NET, which I absolutely hate and could never get into. C# does have some nice features, though, and I continued with it for some years on a hobby basis, but I haven't bothered with it for the past few years.