Code Project
How egregious is my crime to consider unmanaged code?

The Lounge
Tags: c++, css, question
72 Posts, 18 Posters
  • In reply to Member 96:

    Marc Clifton wrote:

    don't come up to snuff performance-wise when working in an n-tier environment

    Ok, I seriously don't want to argue the merits of managed code in any way, but managed n-tier development is an area I'm deeply involved in and I can't fathom what you're saying here. Performance has never been an issue for me, it scales beautifully, where do you get this from?


    When everyone is a hero no one is a hero.

    Marc Clifton wrote (#40):

    John C wrote:

    Performance has never been an issue for me, it scales beautifully, where do you get this from?

    Perhaps there's a difference between what is meant by "managed code" and "managed n-tier"? In any case, I've found the most woesome problems with serialization. The DataTable is incredibly bloated, and the BinaryFormatter is not a true serializer, as it keeps everything in memory until the process is complete; it also contributes to bloat because it doesn't actually result in binary data. Those two artifacts alone make major components of the .NET framework unscalable. Marc

    Thyme In The Country Interacx My Blog

  • In reply to Member 96:

      dan neely wrote:

      and people with a need for performance that can't be solved by just throwing hardware at it will pay extra for C++ with hand assembled hotloops

      I'm sure they will, and they are likely not people in the market for consumer or business software. I can't conceive of any program that can't perform faster with faster hardware; care to enlighten us? Or are you saying it's already running on the biggest cluster on the planet and is still too slow, perhaps a weather simulator or something?


      When everyone is a hero no one is a hero.

      Dan Neely wrote (#41):

      Scientific computing is one of the big markets. Doubling the horsepower would just let the team double the size or number of models they run, keeping the cluster at 100% load. Writing in C++ and doing the hot loop in assembly is cheaper than buying a few hundred or a few thousand more blades for the cluster. Distributed computing projects don't even have the buy-hardware option at all. Gaming is the other. Consoles have fixed hardware specs, and while PCs don't, most PC gamers are already running the fastest hardware they can justify buying. Again, 'buy something faster' isn't an option. Similar arguments can apply to really large enterprisey systems, though most of the time writing managed code and throwing a second server at it is cheaper.

      Otherwise [Microsoft is] toast in the long term no matter how much money they've got. They would be already if the Linux community didn't have its head so firmly up its own command-line buffer that it looks like taking 15 years to find the desktop. -- Matthew Faithfull

  • In reply to Member 96:

        dan neely wrote:

        and people with a need for performance that can't be solved by just throwing hardware at it will pay extra for C++ with hand assembled hotloops

        I'm sure they will, and they are likely not people in the market for consumer or business software. I can't conceive of any program that can't perform faster with faster hardware; care to enlighten us? Or are you saying it's already running on the biggest cluster on the planet and is still too slow, perhaps a weather simulator or something?


        When everyone is a hero no one is a hero.

        El Corazon wrote (#42):

        John C wrote:

        and they are likely not people in the market for consumer

        Very much not true! The game market is driving hardware harder and faster than the military and business markets. If you aren't bending to gamers' hardware, you are falling behind, because hardware is shifting toward gamers hard... so to speak. Business drives hardware only in the way you said: the program is inefficient, you throw larger iron at it until it works, and then you leave it cooking for a year or two. Gamers upgrade regularly and create a lot of income. The hardware market is bending to gamers, and the software market is leveraging the advantage that gamers send its way. Honest, if you are not taking advantage of one of the largest CONSUMER-level software markets, you may well be outdated within two generations of hardware. SLI is here because of gamers, shaders are here because of gamers, multi-core is here because of gamers, SATA is here because of gamers. We are driven by a HUGE consumer-level software market that is driven by efficient code and filled with massive content. Even the military is wise enough to nod their heads and say, "that is a big market, how can we take advantage of it to our benefit?" Simply saying it is old and worn out never makes it so.

        _________________________ Asu no koto o ieba, tenjo de nezumi ga warau. Talk about things of tomorrow and the mice in the ceiling laugh. (Japanese Proverb)

  • In reply to Chris Austin:

          John C wrote:

          You must work in a different world than I do,.......

          Obviously. To me and my customers, performance matters.

          John C wrote:

          did you really think I was advocating sloppy slow code development covered up with faster hardware?

          Yes. Anytime someone says something to the effect of "don't worry if it is slow, you can always upgrade your hardware," alarms go off in my mind.

          A human being should be able to change a diaper, plan an invasion, butcher a hog, conn a ship, design a building, write a sonnet, balance accounts, build a wall, set a bone, comfort the dying, take orders, give orders, cooperate, act alone, solve equations, analyze a new problem, pitch manure, program a computer, cook a tasty meal, fight efficiently, die gallantly. Specialization is for insects. - -Lazarus Long

          Member 96 wrote (#43):

          *Acceptable* performance always matters; it's a given. Most customers expect it: they don't expect to have to wait for anything, and it's damned hard to write software that makes a person wait; you have to have a hugely inept design. What they want to see are features and usability that meet their needs. If a programmer working for me spent all their time eking out milliseconds in the code instead of concentrating on supporting the users' expectations of how the software should work, I'd fire their ass in a heartbeat. Bit fiddling like that, in this modern age of super-fast, off-the-shelf, bargain-basement-priced hardware, is utterly meaningless. It was a huge consideration a decade or more ago; it simply isn't as much of a factor any more. Very few if any seasoned developers would even start down a path that is blatantly unperformant. In a commercial software business your main goal is to make money, and you do that with popular, easy-to-use, well-supported software that has the *features* people want and need. Performance is not a *feature*; it's a given fundamental, like saying "but it must run on a modern computer". Hardware scalability is a feature of modern applications and database management systems, not a band-aid.


          When everyone is a hero no one is a hero.

  • In reply to Marc Clifton:

            John C wrote:

            Performance has never been an issue for me, it scales beautifully, where do you get this from?

            Perhaps there's a difference between what is meant by "managed code" and "managed n-tier"? In any case, I've found the most woesome problems with serialization. The DataTable is incredibly bloated, and the BinaryFormatter is not a true serializer, as it keeps everything in memory until the process is complete; it also contributes to bloat because it doesn't actually result in binary data. Those two artifacts alone make major components of the .NET framework unscalable. Marc

            Thyme In The Country Interacx My Blog

            El Corazon wrote (#44):

            Marc Clifton wrote:

            and the BinaryFormatter is not a true serializer, as it keeps everything in memory until the process is complete, and itself contributes to bloat because it doesn't actually result in binary data. Those two artifacts alone make major components of the .NET framework unscalable.

            Check out the work by the motion picture standards board. There isn't much here (http://en.wikipedia.org/wiki/KLV[^]), but the same disappointments with binary serialization have driven KLV standards back into the general marketplace. You can expect to see it taking over much of the streaming protocols on the internet soon.

            _________________________ Asu no koto o ieba, tenjo de nezumi ga warau. Talk about things of tomorrow and the mice in the ceiling laugh. (Japanese Proverb)

  • In reply to Marc Clifton:

              John C wrote:

              Performance has never been an issue for me, it scales beautifully, where do you get this from?

              Perhaps there's a difference between what is meant by "managed code" and "managed n-tier"? In any case, I've found the most woesome problems with serialization. The DataTable is incredibly bloated, and the BinaryFormatter is not a true serializer, as it keeps everything in memory until the process is complete; it also contributes to bloat because it doesn't actually result in binary data. Those two artifacts alone make major components of the .NET framework unscalable. Marc

              Thyme In The Country Interacx My Blog

              Member 96 wrote (#45):

              Hmm... I've never tried to serialize a DataTable, just my own business object classes. We have a completely managed n-tier design that we know scales very well, from testing and real-world use, based on a modified version of Rocky Lhotka's business object framework, which is pretty widely used. We support a remote DataPortal configuration which serializes data between a user and a remote IIS server. Aside from the wire-transfer overhead, there is very little difference in performance noticeable to the user. I'll admit I've not tried it with some super-high number of test users, like 10,000 or something, but other users of the framework have reported that it's not an issue with sufficient hardware.


              When everyone is a hero no one is a hero.

  • In reply to Dan Neely:

                Scientific computing is one of the big markets. Doubling the horsepower would just let the team double the size or number of models they run, keeping the cluster at 100% load. Writing in C++ and doing the hot loop in assembly is cheaper than buying a few hundred or a few thousand more blades for the cluster. Distributed computing projects don't even have the buy-hardware option at all. Gaming is the other. Consoles have fixed hardware specs, and while PCs don't, most PC gamers are already running the fastest hardware they can justify buying. Again, 'buy something faster' isn't an option. Similar arguments can apply to really large enterprisey systems, though most of the time writing managed code and throwing a second server at it is cheaper.

                Otherwise [Microsoft is] toast in the long term no matter how much money they've got. They would be already if the Linux community didn't have its head so firmly up its own command-line buffer that it looks like taking 15 years to find the desktop. -- Matthew Faithfull

                Member 96 wrote (#46):

                Sure, those are all cycle-hungry users; I thought we were talking about off-the-shelf kinds of software.


                When everyone is a hero no one is a hero.

  • In reply to Dan Neely:

                  Scientific computing is one of the big markets. Doubling the horsepower would just let the team double the size or number of models they run, keeping the cluster at 100% load. Writing in C++ and doing the hot loop in assembly is cheaper than buying a few hundred or a few thousand more blades for the cluster. Distributed computing projects don't even have the buy-hardware option at all. Gaming is the other. Consoles have fixed hardware specs, and while PCs don't, most PC gamers are already running the fastest hardware they can justify buying. Again, 'buy something faster' isn't an option. Similar arguments can apply to really large enterprisey systems, though most of the time writing managed code and throwing a second server at it is cheaper.

                  Otherwise [Microsoft is] toast in the long term no matter how much money they've got. They would be already if the Linux community didn't have its head so firmly up its own command-line buffer that it looks like taking 15 years to find the desktop. -- Matthew Faithfull

                  El Corazon wrote (#47):

                  dan neely wrote:

                  While PCs don't most PC gamers are already running the fastest hardware they can justify buying.

                  And many of them are overclocking that hardware to push it right to the breaking point. From 3.4 lb (1.5 kg) heatsinks with 3×140 mm fans for air cooling, to phase-change and thermoelectric cooling, folks in the gaming market are already pushing the 5 GHz boundary. Even though no commercial outfit is gutsy enough to sell it, gamers are pushing the hardware over the line and past the commercial level, WELL beyond the commercial level... and they are a huge consumer market!!

                  _________________________ Asu no koto o ieba, tenjo de nezumi ga warau. Talk about things of tomorrow and the mice in the ceiling laugh. (Japanese Proverb)

  • In reply to Member 96:

                    Sure, those are all cycle-hungry users; I thought we were talking about off-the-shelf kinds of software.


                    When everyone is a hero no one is a hero.

                    El Corazon wrote (#48):

                    John C wrote:

                    I thought we were talking about off the shelf kind of software.

                    Well, you know, John... you must be right... I have to special-order my copy of games... because Walmart just won't stock games for cycle-hungry users... They ripped out the game section and replaced it with an extra aisle of bath soap... go check yours and see if it is the same... :rolleyes:

                    _________________________ Asu no koto o ieba, tenjo de nezumi ga warau. Talk about things of tomorrow and the mice in the ceiling laugh. (Japanese Proverb)

  • In reply to El Corazon:

                      Marc Clifton wrote:

                      and the BinaryFormatter is not a true serializer, as it keeps everything in memory until the process is complete, and itself contributes to bloat because it doesn't actually result in binary data. Those two artifacts alone make major components of the .NET framework unscalable.

                      Check out the work by the motion picture standards board. There isn't much here (http://en.wikipedia.org/wiki/KLV[^]), but the same disappointments with binary serialization have driven KLV standards back into the general marketplace. You can expect to see it taking over much of the streaming protocols on the internet soon.

                      _________________________ Asu no koto o ieba, tenjo de nezumi ga warau. Talk about things of tomorrow and the mice in the ceiling laugh. (Japanese Proverb)

                      Marc Clifton wrote (#49):

                      El Corazon wrote:

                      You can expect to see it taking over much of the streaming protocols on the internet soon.

                      Fine with me. :) The other interesting thing about KLV is that if you partition the KL from the V, you can get decent compression on the KL part. The KL part would need to be lossless, but the V part, especially for streaming audio/video, can then utilize a lossy compression algorithm. Of course, this requires more preprocessing on the front end because you have to first update all the L's after applying the compression before sending the KL packet. Marc

                      Thyme In The Country Interacx My Blog

  • In reply to El Corazon:

                        John C wrote:

                        and they are likely not people in the market for consumer

                        Very much not true! The game market is driving hardware harder and faster than the military and business markets. If you aren't bending to gamers' hardware, you are falling behind, because hardware is shifting toward gamers hard... so to speak. Business drives hardware only in the way you said: the program is inefficient, you throw larger iron at it until it works, and then you leave it cooking for a year or two. Gamers upgrade regularly and create a lot of income. The hardware market is bending to gamers, and the software market is leveraging the advantage that gamers send its way. Honest, if you are not taking advantage of one of the largest CONSUMER-level software markets, you may well be outdated within two generations of hardware. SLI is here because of gamers, shaders are here because of gamers, multi-core is here because of gamers, SATA is here because of gamers. We are driven by a HUGE consumer-level software market that is driven by efficient code and filled with massive content. Even the military is wise enough to nod their heads and say, "that is a big market, how can we take advantage of it to our benefit?" Simply saying it is old and worn out never makes it so.

                        _________________________ Asu no koto o ieba, tenjo de nezumi ga warau. Talk about things of tomorrow and the mice in the ceiling laugh. (Japanese Proverb)

                        Member 96 wrote (#50):

                        I don't play a lot of games or even think about gaming software usually, so you're right. But surely we are not far off from the point where gaming software has evolved so that the only way to get more performance is more powerful hardware? When some software *isn't* hardware-scalable, doesn't that mean the software hasn't fully evolved?


                        When everyone is a hero no one is a hero.

  • In reply to Member 96:

                          *Acceptable* performance always matters; it's a given. Most customers expect it: they don't expect to have to wait for anything, and it's damned hard to write software that makes a person wait; you have to have a hugely inept design. What they want to see are features and usability that meet their needs. If a programmer working for me spent all their time eking out milliseconds in the code instead of concentrating on supporting the users' expectations of how the software should work, I'd fire their ass in a heartbeat. Bit fiddling like that, in this modern age of super-fast, off-the-shelf, bargain-basement-priced hardware, is utterly meaningless. It was a huge consideration a decade or more ago; it simply isn't as much of a factor any more. Very few if any seasoned developers would even start down a path that is blatantly unperformant. In a commercial software business your main goal is to make money, and you do that with popular, easy-to-use, well-supported software that has the *features* people want and need. Performance is not a *feature*; it's a given fundamental, like saying "but it must run on a modern computer". Hardware scalability is a feature of modern applications and database management systems, not a band-aid.


                          When everyone is a hero no one is a hero.

                          Chris Austin wrote (#51):

                          John C wrote:

                          it's damned hard to write any software that makes a person wait, you have to have a hugely inept design.

                          You haven't used VS.NET lately then, have you? Otherwise I agree with most of your statements. But we are worlds apart on how we feel about performance and relying on quality hardware.

                          A human being should be able to change a diaper, plan an invasion, butcher a hog, conn a ship, design a building, write a sonnet, balance accounts, build a wall, set a bone, comfort the dying, take orders, give orders, cooperate, act alone, solve equations, analyze a new problem, pitch manure, program a computer, cook a tasty meal, fight efficiently, die gallantly. Specialization is for insects. - -Lazarus Long

  • In reply to Marc Clifton:

                            El Corazon wrote:

                            You can expect to see it taking over much of the streaming protocols on the internet soon.

                            Fine with me. :) The other interesting thing about KLV is that if you partition the KL from the V, you can get decent compression on the KL part. The KL part would need to be lossless, but the V part, especially for streaming audio/video, can then utilize a lossy compression algorithm. Of course, this requires more preprocessing on the front end because you have to first update all the L's after applying the compression before sending the KL packet. Marc

                            Thyme In The Country Interacx My Blog

                            El Corazon wrote (#52):

                            Marc Clifton wrote:

                            Fine with me. The other interesting thing about KLV is that if you partition the KL from the V, you can get decent compression on the KL part.

                            The other advantage is that it's future-friendly. If you don't understand the K, you use the L to skip the V. With XML, when you find a key you don't understand, you just keep reading until you find the next piece you know... KLV is a big hit in quite a few markets.

                            _________________________ Asu no koto o ieba, tenjo de nezumi ga warau. Talk about things of tomorrow and the mice in the ceiling laugh. (Japanese Proverb)

  • In reply to Member 96:

                              *Acceptable* performance always matters; it's a given. Most customers expect it: they don't expect to have to wait for anything, and it's damned hard to write software that makes a person wait; you have to have a hugely inept design. What they want to see are features and usability that meet their needs. If a programmer working for me spent all their time eking out milliseconds in the code instead of concentrating on supporting the users' expectations of how the software should work, I'd fire their ass in a heartbeat. Bit fiddling like that, in this modern age of super-fast, off-the-shelf, bargain-basement-priced hardware, is utterly meaningless. It was a huge consideration a decade or more ago; it simply isn't as much of a factor any more. Very few if any seasoned developers would even start down a path that is blatantly unperformant. In a commercial software business your main goal is to make money, and you do that with popular, easy-to-use, well-supported software that has the *features* people want and need. Performance is not a *feature*; it's a given fundamental, like saying "but it must run on a modern computer". Hardware scalability is a feature of modern applications and database management systems, not a band-aid.


                              When everyone is a hero no one is a hero.

                              El Corazon wrote (#53):

                              John C wrote:

                              It was a huge consideration a decade or more ago it simply isn't as much of a factor any more.

                              Only in your market. There is not a computer built today, or next year, that will not be pushed to its limit. If we had 16-core consumer machines right now, it would not be enough. It will never be enough, because the market expands to fill the computer's capability. There is ALWAYS more to do; it will NEVER end. Nothing will ever be fast enough. If you believe otherwise... see if you can run your stuff on a Commodore PET; it was advertised as the last computer you would ever need. :-D

                              _________________________ Asu no koto o ieba, tenjo de nezumi ga warau. Talk about things of tomorrow and the mice in the ceiling laugh. (Japanese Proverb)

  • In reply to El Corazon:

                                Marc Clifton wrote:

                                Fine with me. The other interesting thing about KLV is that if you partition the KL from the V, you can get decent compression on the KL part.

                                The other advantage is that it's future-friendly. If you don't understand the K, you use the L to skip the V. With XML, when you find a key you don't understand, you just keep reading until you find the next piece you know... KLV is a big hit in quite a few markets.

                                _________________________ Asu no koto o ieba, tenjo de nezumi ga warau. Talk about things of tomorrow and the mice in the ceiling laugh. (Japanese Proverb)

                                Marc Clifton wrote (#54):

                                El Corazon wrote:

                                If you don't understand the K, you use the L to skip the V.

                                Great point. Marc

                                Thyme In The Country Interacx My Blog

  • In reply to Member 96:

                                  I don't play a lot of games or even think about gaming software usually, so you're right. But surely we are not far off from the point where gaming software has evolved so that the only way to get more performance is more powerful hardware? When some software *isn't* hardware-scalable, doesn't that mean the software hasn't fully evolved?


                                  When everyone is a hero no one is a hero.

                                  Chris Austin wrote (#55):

                                  John C wrote:

                                  you're right, but surely we are not far off from reaching a point where gaming software has evolved to the point that the only way for more performance is with more powerful hardware?

                                  From the outside looking in it may appear that way. But a majority of the game engines and pipeline tools still don't take full advantage of current hardware. A few, like Gamebryo and Unreal, have pushed at the envelope a bit, but in general I'd say there is room for improvement.

                                  A human being should be able to change a diaper, plan an invasion, butcher a hog, conn a ship, design a building, write a sonnet, balance accounts, build a wall, set a bone, comfort the dying, take orders, give orders, cooperate, act alone, solve equations, analyze a new problem, pitch manure, program a computer, cook a tasty meal, fight efficiently, die gallantly. Specialization is for insects. - -Lazarus Long

  • In reply to El Corazon:

                                    John C wrote:

                                    I thought we were talking about off the shelf kind of software.

                                    Well, you know, John... you must be right... I have to special-order my copy of games... because Walmart just won't stock games for cycle-hungry users... They ripped out the game section and replaced it with an extra aisle of bath soap... go check yours and see if it is the same... :rolleyes:

                                    _________________________ Asu no koto o ieba, tenjo de nezumi ga warau. Talk about things of tomorrow and the mice in the ceiling laugh. (Japanese Proverb)

                                    Member 96 wrote (#56):

                                    Non-gaming off-the-shelf software; sorry if I was unclear. As I pointed out in another post here just now, I don't really think about gaming software, as I don't play games very often. I'm usually thinking about productivity software of one kind or another, and I keep saying that's my perspective; we know yours. Perhaps it would be useful if we had some way of indicating who works in what field in our icons or something, so that there would be more understanding of points of view. A lot of the time, debates flare up for no better reason than that two people work in different fields and take their perspectives from that experience.


                                    When everyone is a hero no one is a hero.

                                    • M Member 96

I don't play a lot of games or even think about gaming software usually, you're right, but surely we are not far off from reaching a point where gaming software has evolved to the point that the only way for more performance is with more powerful hardware? When some software *isn't* hardware-scalable, doesn't that mean the software hasn't fully evolved?


                                      When everyone is a hero no one is a hero.

E Offline
                                      El Corazon
                                      wrote on last edited by
                                      #57

                                      John C wrote:

                                      but surely we are not far off from reaching a point where gaming software has evolved to the point that the only way for more performance is with more powerful hardware?

No, and we will never reach that level. Scientific computation overlaps with games more than you realize. Physics engines keep growing more realistic: from suspension systems on cars to accurate flight models in flight sims, you are talking about mathematical problems that were supercomputer-only a decade ago. As the computer evolves, the games evolve: better graphics, more realistic physics, smarter AI, learning systems, adaptive logic and adaptive terrain (have you ever stopped to think about the physics involved in digging a hole in the ground?). There is a mile-long list of things people would like to add to games. The list will only grow; even as hardware becomes capable of taking on more of it, it will never, ever be completed. Not tomorrow, not 100 years from now. If we had holographic true-3D content, it would still not be enough. If we had projection of 3D graphics into the brain itself, it would still not be enough. It will never, ever be enough.

                                      _________________________ Asu no koto o ieba, tenjo de nezumi ga warau. Talk about things of tomorrow and the mice in the ceiling laugh. (Japanese Proverb)

                                      • C Chris Austin

                                        John C wrote:

                                        it's damned hard to write any software that makes a person wait, you have to have a hugely inept design.

You haven't used VS.net lately then, have you? Otherwise I agree with most of your statements. But, we are worlds apart on how we feel about performance and relying on quality hardware.

                                        A human being should be able to change a diaper, plan an invasion, butcher a hog, conn a ship, design a building, write a sonnet, balance accounts, build a wall, set a bone, comfort the dying, take orders, give orders, cooperate, act alone, solve equations, analyze a new problem, pitch manure, program a computer, cook a tasty meal, fight efficiently, die gallantly. Specialization is for insects. - -Lazarus Long

M Offline
                                        Member 96
                                        wrote on last edited by
                                        #58

                                        Chris Austin wrote:

You haven't used VS.net lately then, have you?

Every day. Its slowness has been pretty much unnoticeable at any point since I upgraded my hardware to a quad core, 4 GB of RAM, Vista, and a super-fast RAID 0 SATA array. But I expect to need that hardware, because there is a lot of file access going on that just can't be avoided, and my main solution consists of 22 separate projects that comprise a .NET app with a WinForms UI, two different ASP.NET UIs, and several accounting-integration and other utility add-ons.

                                        Chris Austin wrote:

                                        But, we are worlds apart on how we feel about performance and relying on quality hardware

Why would that be? Surely you know there's a point beyond which you can't, or it's unreasonable to, optimize the software any further, and the only thing left if more performance is required is more powerful hardware. In my world it's about scalability: our software is designed to scale from a single user on a basic Pentium to a web farm with multiple SQL servers and thousands of concurrent users, exactly the same software, no different versions. I know a thing or two about performance, you have to in this scenario, but scalability is a good thing. It gives users options; it's not about relying on hardware to cover up sloppy programming. If an enterprise customer wants to run our software, they just apply the appropriate hardware; the software is designed for that. That's how I see it, and perhaps why we have different points of view on the situation.


                                        When everyone is a hero no one is a hero.

                                        • E El Corazon

                                          John C wrote:

                                          It was a huge consideration a decade or more ago it simply isn't as much of a factor any more.

Only in your market. There is not a computer built today, or next year, that will not be pushed to its limit. If we had 16-core consumer machines right now, it would not be enough. It will never be enough, because the market expands to fill the computer's capability. There is ALWAYS more to do; it will NEVER end. Nothing will ever be fast enough. If you believe otherwise... see if you can run your stuff on a Commodore PET, it was advertised as the last computer you would ever need. :-D

                                          _________________________ Asu no koto o ieba, tenjo de nezumi ga warau. Talk about things of tomorrow and the mice in the ceiling laugh. (Japanese Proverb)

M Offline
                                          Member 96
                                          wrote on last edited by
                                          #59

                                          El Corazon wrote:

                                          only in your market.

Perhaps, but "my" market is a pretty big one. Yours is an exception, an edge case of sorts. Both of us make assumptions based on what we are involved in; the difference is that my assumptions are a bit more widely applicable. ;)


                                          When everyone is a hero no one is a hero.
