Performance in C# application

Forum: C# · 7 Posts · 2 Posters
Tags: database, csharp, performance, sql-server, sysadmin
#1 · sergiq2

Hello. I am writing an ERP-type application (.NET 3.5, C#, SQL Server 2008) with modules such as HRM, SCM, FRM, and MRP. It is a multi-user application. I decided to store all the important data from the database in Dictionaries or Lists; for example, for invoices I create a Dictionary<int, Invoice>, and so on. To reduce connections to the database, I have an event-based mechanism that updates specific entries in my Dictionaries. I chose this solution because I don't want to refill my GridView from the database every time one record is changed by another user. To synchronize my Dictionaries I want to use Query Notifications and the change tracking mechanism in SQL Server 2008. And it all works fine, but...

A hypothetical situation: I have, for example, 20 Dictionaries, and each of them holds 2,000-5,000 elements. I fill them when the user logs on to the application. How does this affect memory usage? Are cached Dictionaries a better solution than loading the data (e.g. invoices) from the database only when I need to use it? Sorry, my English is not perfect.
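For reference, a minimal sketch of the Query Notification approach described above, using SqlDependency. The connection string, table, and columns are placeholders; it assumes Service Broker is enabled on the database, and the query must follow the notification rules (explicit column list, two-part table names, no SELECT *):

    using System;
    using System.Collections.Generic;
    using System.Data.SqlClient;

    class InvoiceCache
    {
        // Placeholder connection string; substitute your own.
        const string ConnStr = "Data Source=.;Initial Catalog=Erp;Integrated Security=true";

        static readonly Dictionary<int, string> Invoices = new Dictionary<int, string>();

        static void LoadAndSubscribe()
        {
            using (var conn = new SqlConnection(ConnStr))
            using (var cmd = new SqlCommand("SELECT Id, Number FROM dbo.Invoice", conn))
            {
                // A dependency fires once per change and must then be re-registered.
                var dependency = new SqlDependency(cmd);
                dependency.OnChange += delegate(object sender, SqlNotificationEventArgs e)
                {
                    // e.Info reports what happened (Insert/Update/Delete).
                    // This runs on a thread-pool thread; marshal to the UI
                    // thread before touching a GridView in a real app.
                    LoadAndSubscribe();
                };

                conn.Open();
                using (var reader = cmd.ExecuteReader())
                {
                    Invoices.Clear();
                    while (reader.Read())
                        Invoices[reader.GetInt32(0)] = reader.GetString(1);
                }
            }
        }

        static void Main()
        {
            SqlDependency.Start(ConnStr);  // requires Service Broker on the DB
            LoadAndSubscribe();
            Console.ReadLine();            // keep the listener alive
            SqlDependency.Stop(ConnStr);
        }
    }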

#2 · Jon Rista

Such extensive caching is usually not required unless you have a tremendous load on your servers. Generally speaking, you should avoid caching as much as possible, because it adds an additional level of complexity that must be managed on top of managing the data in the database. You greatly increase the risk of data corruption by caching everything all the time, because it becomes your responsibility to make absolutely certain that data integrity constraints are met, rather than letting the RDBMS do it for you.

Not only that, you will indeed greatly increase your memory footprint, and that footprint will grow as the usage of your application grows, to the point where you can't keep everything in memory at all times. If you already have performance problems, there are other ways to solve them, such as scaling out your hardware (better hardware, more servers, etc.). If you do not have performance problems yet and are trying to preemptively solve possible ones, don't bother: it is difficult to predict what will cause performance problems, caching isn't always the best solution, and it should generally not be the first. You can gain much more performance for less cost by adding hardware than by increasing the complexity of your code.

#3 · sergiq2

Thank you for your answer. Unfortunately I already have this mechanism in my application and two modules are done; it is hard to change everything at this point. But... is it a big problem if my whole application, across all modules, has a maximum of 100,000 rows in all database tables combined? I don't know whether 100,000 records is a lot, or how much it affects the load on the system. I want to sell this application bundled with a computer (a notebook), and my minimum requirements are: Core 2 Duo 2 GHz, minimum 2 GB RAM.

#4 · Jon Rista

I can't say exactly what the memory load would be; it depends entirely on the size of your records. If they are all maximum size (approximately 8,000 bytes, barring varchar(max) columns), and factoring in .NET overhead, 100,000 records will use roughly a gigabyte of memory. That is just for the rows; it doesn't account for any other memory used by your application. Assuming you want this application to work on 32-bit systems, you will really be pushing it, as each process on 32-bit Windows gets a maximum of 2 GB of addressable memory space.

That aside, there are much deeper concerns than memory consumption here. You are trying to cache ALL of your data in memory. It is not as simple as sticking all your records in a collection: there are usually relationships between data, concurrent updates, general data integrity, etc. to worry about when you cache that much data. Despite the fact that you have already started development, I think you need to take a step back and really evaluate what caching all your data means for the long-term sustainability of your project. You may think that continuing on will save time and money, but you are just as likely to introduce some very complex scenarios due to your caching that will need to be addressed down the road, and that could cause your long-term costs to explode far beyond the short-term cost of reevaluating your approach.
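To put numbers on that estimate: 100,000 rows at roughly 8,000 bytes each is about 800 MB before .NET object overhead. A quick way to sanity-check the footprint of an actual cache is to measure it directly; a rough sketch, where the Invoice shape is hypothetical (substitute your real entity):

    using System;
    using System.Collections.Generic;

    class MemoryEstimate
    {
        // Hypothetical entity, sized like a modest invoice row.
        class Invoice
        {
            public int Id;
            public string Number = new string('x', 20);
            public string Customer = new string('x', 100);
            public decimal Total;
            public DateTime Issued;
        }

        static void Main()
        {
            long before = GC.GetTotalMemory(true);  // force a full collect first

            var cache = new Dictionary<int, Invoice>();
            for (int i = 0; i < 100000; i++)
                cache.Add(i, new Invoice { Id = i });

            long after = GC.GetTotalMemory(true);
            Console.WriteLine("~" + (after - before) / (1024 * 1024)
                + " MB for " + cache.Count + " cached rows");
        }
    }

Run against entities of your real size, this gives a much better answer than any rule of thumb, and makes the 2 GB per-process ceiling on 32-bit Windows easy to reason about.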

#5 · sergiq2

Thanks for your answer. Since I have 2 of the 4 modules finished, I'll change the way I work with the database while building the 2 new modules, and over time I'll try to change everything except the simple dictionaries, such as invoice types or education levels. What do you suggest as an alternative? I have thought about:

  • strongly typed DataSets,
  • LINQ to SQL (though I have read that it is slower than classic ADO.NET),
  • Entity Framework,
  • disconnected ADO.NET,
  • or something else?

I would like to keep communicating with the database through stored procedures. Overall, I am not afraid of new challenges, and if I need to, I'll learn new technologies. Thank you once again.

#6 · Jon Rista

I would recommend LINQ to SQL. You can build a conceptual model with L2S, then add stored procedures that return objects from your model. You can still use SPs, yet work with strongly typed objects rather than DataSets. L2S is not slow if you use it for what it is: an ORM. The SQL generated by L2S is actually very efficient. If you use it with procs you won't really see any of the performance benefits, but neither will you run into anything that could cause performance problems (i.e. working with huge object graphs and their changes, which can get kind of hairy).

Entity Framework is premature. It has potential, but it is up to Microsoft to realize that potential. Currently EF is very intrusive and heavy. It works great for non-distributed apps where the client app is not separated from its business layer by web services or remoting. However, if there are web services separating your presentation from your domain, EF is a real disaster.

There is another free ORM, called NHibernate. It came from the Java world, so it doesn't fit well with Microsoft conventions, but it is one of the better ORMs out there. It still has some of the problems that EF does, as it is a bit intrusive; however, it generally supports POCO and persistence ignorance (PI), so it is currently a better choice than EF if you want a real ORM.

Since you are currently looking to use stored procs, I would definitely go with L2S. It is the simplest solution that will get you the quickest results with procs.
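For illustration, mapping a stored procedure to strongly typed objects in L2S looks roughly like the following; the entity, the proc name dbo.GetInvoicesByCustomer, and its parameter are made up for the example (in practice the O/R designer generates this code when you drag a proc onto the design surface):

    using System.Data.Linq;
    using System.Data.Linq.Mapping;
    using System.Reflection;

    [Table(Name = "dbo.Invoice")]
    public class Invoice
    {
        [Column(IsPrimaryKey = true)] public int Id;
        [Column] public string Number;
        [Column] public decimal Total;
    }

    public class ErpDataContext : DataContext
    {
        public ErpDataContext(string connectionString) : base(connectionString) { }

        // Exposes dbo.GetInvoicesByCustomer (hypothetical proc) as a method
        // returning typed Invoice objects instead of a DataSet.
        [Function(Name = "dbo.GetInvoicesByCustomer")]
        public ISingleResult<Invoice> GetInvoicesByCustomer(
            [Parameter(Name = "CustomerId")] int customerId)
        {
            IExecuteResult result = ExecuteMethodCall(this,
                (MethodInfo)MethodBase.GetCurrentMethod(), customerId);
            return (ISingleResult<Invoice>)result.ReturnValue;
        }
    }

Calling it is then just: using (var db = new ErpDataContext(connStr)) { foreach (var inv in db.GetInvoicesByCustomer(42)) { ... } }. The proc still runs on the server, but you get typed objects back.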

#7 · sergiq2

OK, thanks for your advice. I'll try to learn LINQ to SQL.
