How egregious is my crime to consider unmanaged code?
-
We have a ton of existing C++ code I can reuse to build an n-tier enterprise management solution. Does the lounge consider it a crime to consider using unmanaged C++ for a new project?
Do we weigh less at high tide?
If you write code for money, then there is only one crime: spending too much money to make too little money. I wouldn't start any new project in unmanaged code, but if I had a huge investment in an existing library I'd use it as is; there's no profit in re-inventing the wheel.
When everyone is a hero no one is a hero.
-
chris ruff wrote:
Does the lounge consider it a crime to consider using unmanaged C++ for a new project?
Nope. These newfangled languages are poorly architected, the supporting frameworks are kludgy and buggy, and the designers appear to be drunk on syntactic sugar and eye candy, to say the least. I'm dead serious. C# is cool, and I thoroughly enjoy the n-tier architecture that I build with it, but I'm depressed by what they're doing to the language and the framework. I continually encounter areas of the framework that don't come up to snuff performance-wise when working in an n-tier environment, or that are so dumbed down as to be unusable; either case requires replacing what .NET provides. I'm disappointed with the language enhancements, feeling that there is no roadmap other than "screw everybody else's ideas, but don't admit they even have ideas, because we're going to hijack them anyway." The changes to C++ that someone posted about a week or so ago, the ones the Intel compiler supports, that's the stuff that gets me wishing I'd developed Interacx in C++. Seriously. Marc
Marc Clifton wrote:
don't come up to snuff performance-wise when working in an n-tier environment
Ok, I seriously don't want to argue the merits of managed code in any way, but managed n-tier development is an area I'm deeply involved in and I can't fathom what you're saying here. Performance has never been an issue for me, it scales beautifully, where do you get this from?
When everyone is a hero no one is a hero.
-
Dario Solera wrote:
although fast hardware is a commodity nowadays.
That should never be a consideration when designing your code.
A human being should be able to change a diaper, plan an invasion, butcher a hog, conn a ship, design a building, write a sonnet, balance accounts, build a wall, set a bone, comfort the dying, take orders, give orders, cooperate, act alone, solve equations, analyze a new problem, pitch manure, program a computer, cook a tasty meal, fight efficiently, die gallantly. Specialization is for insects. - -Lazarus Long
That should *always* be a consideration when designing your code. Designing software isn't about dogmatic adherence to rules and regulations. You have a need, you take *everything* into consideration when designing a solution, and the best design is the best design. I could quickly go out of business spending an inordinate amount of time fiddling around trying to eke out the last millisecond of performance while my end users languished waiting for an update. People who have a serious need for software will gladly supply any hardware required, because it *is* in fact a commodity these days.
When everyone is a hero no one is a hero.
-
El Corazon wrote:
there are those who believe C++ is dead, and those who do not. You are about to find out who.
Read that as "there are those who bought the .NET marketing and advertising hype, and those who did not."
-
and people with a need for performance that can't be solved by just throwing hardware at it will pay extra for C++ with hand assembled hotloops.
Otherwise [Microsoft is] toast in the long term no matter how much money they've got. They would be already if the Linux community didn't have its head so firmly up its own command line buffer that it looks like taking 15 years to find the desktop. -- Matthew Faithfull
-
dan neely wrote:
and people with a need for performance that can't be solved by just throwing hardware at it will pay extra for C++ with hand assembled hotloops
I'm sure they will and they are likely not people in the market for consumer or business software. I can't conceive of any program that can't perform faster with faster hardware, care to enlighten us? Or are you saying it's already running on the biggest cluster on the planet and is still too slow, perhaps a weather simulator or something?
When everyone is a hero no one is a hero.
-
Sloppy is sloppy and slow is slow.
John C wrote:
You have a need, you take *everything* into consideration when designing a solution, the best design is the best design.
What does hardware have to do with quality software?
John C wrote:
I could quickly go out of business spending an inordinate amount of time fiddling around trying eke out the last millisecond of performance while my end users languished waiting for an update.
No one even mentioned that. I am simply stating that relying on hardware to hide the effects of ill-conceived code is a non-starter. Good hardware is no excuse for sloppy and slow products.
John C wrote:
People who have a serious need for software will gladly supply any hardware required because it *is* in fact a commodity these days.
Not the people I have and want as customers.
A human being should be able to change a diaper, plan an invasion, butcher a hog, conn a ship, design a building, write a sonnet, balance accounts, build a wall, set a bone, comfort the dying, take orders, give orders, cooperate, act alone, solve equations, analyze a new problem, pitch manure, program a computer, cook a tasty meal, fight efficiently, die gallantly. Specialization is for insects. - -Lazarus Long
-
It's a bit late in the game to attribute the success of .net solely to marketing and advertising hype. :rolleyes:
When everyone is a hero no one is a hero.
John C wrote:
the success of .net
Have there been any 100% .net major desktop products?
A human being should be able to change a diaper, plan an invasion, butcher a hog, conn a ship, design a building, write a sonnet, balance accounts, build a wall, set a bone, comfort the dying, take orders, give orders, cooperate, act alone, solve equations, analyze a new problem, pitch manure, program a computer, cook a tasty meal, fight efficiently, die gallantly. Specialization is for insects. - -Lazarus Long
-
Chris Austin wrote:
I am simply stating that relying of hardware to hide the effects of ill conceived code is a non starter
Then maybe you should have written that in the first place. :)
Chris Austin wrote:
What does hardware have to do with quality software?
You must work in a different world than I do; hardware has everything to do with software. It's not about sloppy code, it's about what you can do with the level of hardware your customers are likely to have or want to pay for. Given infinite processing power there's a lot I could do that I can't do right now in the software I write. It's a balance, like everything else. I've found it useful to assume, when I read a post and reply to it, that the person who wrote it is *not* an idiot, and work from there. I.e., did you really think I was advocating sloppy, slow code development covered up with faster hardware?
When everyone is a hero no one is a hero.
-
John C wrote:
You must work in a different world than I do,.......
Obviously. To me and my customers, performance matters.
John C wrote:
did you really think I was advocating sloppy slow code development covered up with faster hardware?
Yes. Any time someone says something to the effect of "don't worry if it is slow, you can always upgrade your hardware", alarms go off in my mind.
A human being should be able to change a diaper, plan an invasion, butcher a hog, conn a ship, design a building, write a sonnet, balance accounts, build a wall, set a bone, comfort the dying, take orders, give orders, cooperate, act alone, solve equations, analyze a new problem, pitch manure, program a computer, cook a tasty meal, fight efficiently, die gallantly. Specialization is for insects. - -Lazarus Long
-
John C wrote:
Performance has never been an issue for me, it scales beautifully, where do you get this from?
Perhaps there is a difference between what is meant by "managed code" and "managed n-tier"? In any case, I've found the most troublesome problems with serialization. The DataTable is incredibly bloated, and the BinaryFormatter is not a true serializer: it keeps everything in memory until the process is complete, and it contributes to bloat itself because it doesn't actually result in compact binary data. Those two artifacts alone make major components of the .NET framework unscalable. Marc
-
Scientific computing is one of the big markets. All doubling the horsepower would do for such a team is let them double the size or number of models they run, keeping the cluster at 100% load. Writing in C++ and doing the hot loop in assembly is cheaper than buying a few hundred or a few thousand more blades for the cluster. Distributed computing projects don't even have the buy-hardware option at all. Gaming is the other. Consoles have fixed hardware specs, and while PCs don't, most PC gamers are already running the fastest hardware they can justify buying. Again, 'buy something faster' isn't an option. Similar arguments can apply to really large enterprisey systems, although most of the time writing managed code and throwing a second server at it is cheaper.
Otherwise [Microsoft is] toast in the long term no matter how much money they've got. They would be already if the Linux community didn't have its head so firmly up its own command line buffer that it looks like taking 15 years to find the desktop. -- Matthew Faithfull
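[Editor's note: dan's "hot loop" point is a standard pattern: profile, find the one kernel that dominates the run time, and hand-tune only that kernel while the rest of the program stays high level. A minimal illustrative C++ sketch (not anyone's actual production code) of such a kernel:

```cpp
// Sketch of a hand-tuned "hot loop": a dot product written with two
// accumulators to break the floating-point add dependency chain. A real
// HPC team might go further and replace the loop body with SSE/AVX
// intrinsics or assembly once profiling shows it dominates.
#include <cstddef>
#include <vector>
#include <cassert>

double dot(const double* a, const double* b, std::size_t n) {
    double acc0 = 0.0, acc1 = 0.0;   // independent accumulators let the
    std::size_t i = 0;               // CPU overlap the multiply-adds
    for (; i + 1 < n; i += 2) {
        acc0 += a[i] * b[i];
        acc1 += a[i + 1] * b[i + 1];
    }
    if (i < n) acc0 += a[i] * b[i];  // odd-length tail
    return acc0 + acc1;
}
```

The surrounding application code never changes; only this kernel gets rewritten as the performance budget demands.]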
-
John C wrote:
and they are likely not people in the market for consumer
Very much not true! The game market is driving hardware harder and faster than the military and business markets. If you aren't bending to gamers' hardware, you are falling behind, because hardware is shifting toward gamers hard... so to speak. Business drives hardware only in the way you said: the program is inefficient, you throw bigger iron at it until it works, and then you leave it cooking for a year or two. Gamers upgrade regularly and create a lot of income. The hardware market is bending to gamers, and the software market is leveraging the advantage that gamers send its way. Honestly, if you are not taking advantage of one of the largest CONSUMER-level software markets, you may well be outdated within two generations of hardware. SLI is here because of gamers, shaders are here because of gamers, multi-core is here because of gamers, SATA is here because of gamers. We are driven by a HUGE consumer-level software market, one that is driven by efficient code and filled with massive content. Even the military is wise enough to nod their heads and say, "that is a big market, how can we take advantage of it to our benefit?" Simply saying it is old and worn out never makes it so.
_________________________ Asu no koto o ieba, tenjo de nezumi ga warau. Talk about things of tomorrow and the mice in the ceiling laugh. (Japanese Proverb)
-
*Acceptable* performance always matters; it's a given. Most customers expect it: they don't expect to have to wait for anything, and it's damned hard to write software that makes a person wait; you'd need a hugely inept design. What they want to see are features and usability that meet their needs. If a programmer working for me spent all their time eking out milliseconds in the code instead of concentrating on supporting the users' expectations of how the software should work, I'd fire their ass in a heartbeat. Bit-fiddling like that, in this modern age of super-fast, off-the-shelf, bargain-basement-priced hardware, is utterly meaningless. It was a huge consideration a decade or more ago; it simply isn't as much of a factor any more. Very few, if any, seasoned developers would even start down a path that is blatantly unperformant. In a commercial software business your main goal is to make money, and you do that with popular, easy-to-use, well-supported software that has the *features* that people want and need. Performance is not a *feature*; it's a fundamental given, like saying "but it must run on a modern computer". Hardware scalability is a feature of modern applications and database management systems, not a band-aid.
When everyone is a hero no one is a hero.
-
Marc Clifton wrote:
and the BinaryFormatter is not a true serializer, as it keeps everything in memory until the process is complete, and itself contributes to bloat because it doesn't actually result in binary data. Those two artifacts alone make major components of the .NET framework unscalable.
Check out the work by the motion picture standards board. There isn't much here, http://en.wikipedia.org/wiki/KLV[^], but the same disappointments with binary serialization have driven KLV standards back into the general marketplace. You can expect to see it taking over much of the streaming protocols on the internet soon.
_________________________ Asu no koto o ieba, tenjo de nezumi ga warau. Talk about things of tomorrow and the mice in the ceiling laugh. (Japanese Proverb)
-
Hmm... I've never tried to serialize a DataTable, just my own business object classes. We have a completely managed n-tier design that we know, from testing and real-world use, scales very well; it's based on a modified version of Rocky Lhotka's business object framework, which is pretty widely used. We support a remote data portal configuration which serializes data between a user and a remote IIS server. Aside from the wire-transfer overhead, there is very little difference in performance noticeable to the user. I'll admit I've not tried it with some super-high number of test users, like 10,000 or something, but other users of the framework have reported that it's not an issue with sufficient hardware.
When everyone is a hero no one is a hero.
-
dan neely wrote:
While PCs don't, most PC gamers are already running the fastest hardware they can justify buying.
And many of them are overclocking that hardware to push it right to the breaking point. From 3.4 lb (1.5 kg) air coolers with three 140mm fans, to phase-change, to thermoelectric cooling, folks in the gaming market are already pushing the 5GHz boundary, even though no commercial outfit is gutsy enough to sell it. Gamers are pushing the hardware over the line and past the commercial level, WELL beyond the commercial level... and they are a huge consumer market!!
_________________________ Asu no koto o ieba, tenjo de nezumi ga warau. Talk about things of tomorrow and the mice in the ceiling laugh. (Japanese Proverb)
-
Sure, those are all cycle-hungry users; I thought we were talking about off-the-shelf kind of software.
When everyone is a hero no one is a hero.
John C wrote:
I thought we were talking about off the shelf kind of software.
Well, you know, John... you must be right... I have to special-order my copies of games, because Walmart just won't stock games for cycle-hungry users... They ripped out the game section and replaced it with an extra aisle of bath soap... go check yours and see if it is the same... :rolleyes:
_________________________ Asu no koto o ieba, tenjo de nezumi ga warau. Talk about things of tomorrow and the mice in the ceiling laugh. (Japanese Proverb)
-
El Corazon wrote:
You can expect to see it taking over much of the streaming protocols on the internet soon.
Fine with me. :) The other interesting thing about KLV is that if you partition the KL from the V, you can get decent compression on the KL part. The KL part would need to be lossless, but the V part, especially for streaming audio/video, can then utilize a lossy compression algorithm. Of course, this requires more preprocessing on the front end because you have to first update all the L's after applying the compression before sending the KL packet. Marc