Extreme performance issues (SQL Server 2000/ADO.NET/C#)

Tags: database, csharp, sysadmin, sql-server, performance
Jon Rista wrote (#1):

I'm using ADO.NET in a Windows service application to perform a process on SQL Server 2000. This process runs very quickly through Query Analyser or Enterprise Manager, but takes an excessively long time when run through my application. To be more precise, executing the stored procedures and views through Query Analyser takes between 10 and 20 seconds. The exact same stored procedures and views, run in the exact same order through my program, take anywhere from 30 minutes to 2 hours to complete, and the system that runs SQL Server (a 4-CPU Xeon system with 2 GB of physical RAM) is pegged at 25% CPU usage (the query uses 100% of a single CPU's worth of processing power). I am at a complete loss as to why such a vast difference in execution time would occur, but here are some details. The Windows service executes on a workstation. SQL Server 2000 executes on a different server, reached over a 100 Mbps Ethernet network. Query Analyser/Enterprise Manager run on the same workstation as the Windows service.

The process is as follows:

1) Run a stored procedure to clear temp tables.
2) Import raw text data into a SQL Server table (Reconciliation).
3) Import data from a Microsoft Access database into 3 SQL Server tables (Accounts, HistoricalPayments, CurrentPayments). (This takes about 10-15 minutes to import 70,000-100,000 records from an Access database housed on a network share on a different server.)
4) "Bucketize" the imported data. This process gathers data from the 4 tables stated so far (Reconciliation, Accounts, HistoricalPayments, CurrentPayments), places records into another table (Buckets), and assigns a primary category number to each record through a stored procedure.
5) Sort buckets of data into subcategories, updating each record in Buckets and assigning a subcategory number, through another stored procedure.
6) Retrieve a summary of the data in Buckets (a count of rows and a summation of monetary values), grouped by the primary category number. This is a view.
7) Retrieve a summary of the data in Buckets, grouped by both the primary and sub category numbers. This is a view.

When I execute these steps manually through Query Analyser (save step 3), each query takes anywhere from 1 second to 20 seconds. The views, surprisingly, take more time than the fairly complex stored procedures of steps 4 and 5. When I execute these steps automatically using my Windows service (written in .NET, C#, using ADO.NET)…
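(For reference, a minimal sketch of the kind of ADO.NET call pattern such a service might use; the server, database, and procedure names below are placeholders, not taken from the post.)

    using System;
    using System.Data;
    using System.Data.SqlClient;

    class ProcessRunner
    {
        // Hypothetical connection string; the real server and database are not named in the post.
        const string ConnString =
            "Data Source=SQLSERVER01;Initial Catalog=Payments;Integrated Security=SSPI;";

        // Each step is a stored procedure executed synchronously over the network.
        static void RunStep(string procName)
        {
            using (SqlConnection conn = new SqlConnection(ConnString))
            using (SqlCommand cmd = new SqlCommand(procName, conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.CommandTimeout = 0; // no timeout; these are long-running batch steps
                conn.Open();
                cmd.ExecuteNonQuery(); // blocks until the procedure completes
            }
        }

        static void Main()
        {
            RunStep("usp_ClearTempTables");   // step 1 (procedure name assumed)
            // steps 2 and 3: text and Access imports, omitted here
            RunStep("usp_BucketizeData");     // step 4 (procedure name assumed)
            RunStep("usp_SortSubcategories"); // step 5 (procedure name assumed)
        }
    }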


Steven Campbell wrote (#2):

Some thoughts/questions, in random order:

1) Are you executing the stored procedures synchronously?
2) Are you using ADO.NET transactions? (Don't.)
3) Is your service running at a low priority? (Use Task Manager to view base priority.)
4) Have you installed the latest service pack for whatever version of .NET you are using?
5) What network library are you using to connect?
6) In the SQL Server Client Network Utility, is the "Enable Shared Memory Protocol" checkbox checked? (Try unchecking it.)
7) Which method are you using to execute? (Best would be ExecuteNonQuery.)
8) Try placing SET NOCOUNT ON as the first line of your stored procedures. (Points 7 and 8 are sketched below.)

See also this MSDN checklist.
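(A rough sketch of points 7 and 8 together; the procedure name and body below are invented purely for illustration, since the thread does not show the real procedures.)

    using System.Data;
    using System.Data.SqlClient;

    static class ChecklistSketch
    {
        // Point 8: SET NOCOUNT ON as the first statement suppresses the
        // "N rows affected" (DONE_IN_PROC) message that is otherwise streamed
        // back to the client for every statement inside the procedure.
        const string CreateProc = @"
    CREATE PROCEDURE usp_ClearTempTables AS
    SET NOCOUNT ON
    DELETE FROM Buckets
    DELETE FROM Reconciliation";

        // Point 7: ExecuteNonQuery runs the procedure without constructing
        // a reader or DataSet on the client.
        public static void RunProc(SqlConnection conn, string procName)
        {
            using (SqlCommand cmd = new SqlCommand(procName, conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.ExecuteNonQuery();
            }
        }
    }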



Steven Campbell wrote (#3):

        (8) is probably your best bet (SET NOCOUNT ON)



Jon Rista wrote (#4):

Thanks for the questions. First, let me answer them:

1) I'm not sure what you mean by "synchronously". I just call sqlCommand.ExecuteNonQuery() after setting up the parameters.
2) Yes, I was. I eliminated them after adding a temporary stored procedure that I ran from Query Analyser. I added a SQL transaction around the calls to all my other stored procedures, and processing time jumped a ton; I let it run for just over half an hour and then killed it. I think transactions may have been the problem, or at least a large part of it. I don't know why, but my queries, which are few but complex, may just be too much to transact.
3) SQL Server is running at a normal priority (priority level 7).
4) All of .NET is fully updated to the latest. We actually have a few versions, but the code has a .exe.config specifying which version to use.
5) I'm connecting to SQL Server using TCP/IP.
6) Shared memory protocol is disabled.
7) I'm executing with ExecuteNonQuery().
8) I have not been using SET NOCOUNT ON. I forgot all about that, but I'll try adding it in a moment and see what happens.

Now, to my questions. What exactly happens when you run queries through a transaction? Do the changes made get marked in a certain way, or is there some more extensive processing done to properly transact a set of queries/procedure calls? When I ran each of these stored procedures one by one, manually in QA, each one took between 1 and 20 seconds to run. The whole process couldn't have even taken a minute, even including the time for me to type the commands. Is it really possible that enabling transactions could induce a performance hit of at least 240%, possibly more? What kind of difference is there in performance between using true SQL Server transactions and ADO.NET SqlTransaction? I've used transactions, and ADO.NET SqlTransactions, in the past. I've never run into such performance issues, but then again, I've never transacted a set of queries as complex as these. There aren't many of them, maybe 20 total, but they are all fairly complicated queries with lots of summations, a couple of subqueries, and an average of 3 joins per query (except for the DELETEs). Maybe some resources on SQL Server transactions, tips on improving their performance, proper ordering of queries, and proper query structuring to maximize performance might help. I've been searching the net, but haven't found anything truly helpful yet. Thanks for the tips so far. ;)
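(A hedged sketch of the two transaction styles being compared; the procedure name is a placeholder. Both end up as the same engine-level transaction: the ADO.NET SqlTransaction simply issues the BEGIN/COMMIT over the wire on your behalf.)

    using System.Data;
    using System.Data.SqlClient;

    static class TransactionStyles
    {
        // Style 1: ADO.NET transaction. The SqlTransaction object sends
        // BEGIN TRAN / COMMIT TRAN on the connection for you.
        public static void AdoNetStyle(SqlConnection conn)
        {
            SqlTransaction tx = conn.BeginTransaction();
            try
            {
                using (SqlCommand cmd = new SqlCommand("usp_BucketizeData", conn, tx)) // name assumed
                {
                    cmd.CommandType = CommandType.StoredProcedure;
                    cmd.ExecuteNonQuery();
                }
                tx.Commit();
            }
            catch
            {
                tx.Rollback();
                throw;
            }
        }

        // Style 2: the transaction lives entirely in T-SQL.
        public static void TsqlStyle(SqlConnection conn)
        {
            const string batch = @"
    BEGIN TRAN
    EXEC usp_BucketizeData  -- name assumed
    COMMIT TRAN";
            using (SqlCommand cmd = new SqlCommand(batch, conn))
            {
                cmd.ExecuteNonQuery();
            }
        }
    }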


Steven Campbell wrote (#5):

Usually, if anything, transactions will speed things up. Everything in SQL is a transaction, so it is only the size that differs when you make "large" transactions. The reason I asked about ADO.NET transactions is that sometimes it is best to just stick to one type. So, if you are using SQL transactions, then just use those; if you are using ADO.NET transactions, then try to avoid SQL transactions.

Deadlocks can also be a problem when using transactions. Usually, though, these will come back as errors. While your queries are running, you can run the command sp_who2 in Query Analyzer to see if any queries are being blocked. This can be either a deadlock situation or just some other process that is also trying to do stuff.

I don't know how much it would help in this scenario, but for optimizing queries, my best tip is to use the "optimizer hints". For example: DELETE MyTable (TABLOCKX) WHERE X = 1. The TABLOCKX hint tells the query to take a single exclusive lock on the table, rather than using the default page-style locking. In certain queries, this can improve performance tremendously. (It applies to all types of queries, not just deletes; see the sketch below.)
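(To make both suggestions concrete, a small sketch: the DELETE statement is the one from the post, and the sp_who2 column names are those returned by SQL Server 2000; everything else is illustrative.)

    using System.Data;
    using System.Data.SqlClient;

    static class DiagnosticsAndHints
    {
        // While the slow process runs, sp_who2 lists active sessions; a
        // non-empty BlkBy column means that session is blocked by another SPID.
        public static void ShowBlocking(SqlConnection conn)
        {
            using (SqlCommand cmd = new SqlCommand("sp_who2", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                using (SqlDataReader reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        System.Console.WriteLine("{0}\t{1}\t{2}",
                            reader["SPID"], reader["Status"], reader["BlkBy"]);
                    }
                }
            }
        }

        // The TABLOCKX hint takes one exclusive table lock up front instead of
        // escalating through row/page locks. Table and predicate are from the
        // post's own example, not a real schema.
        public static void DeleteWithTableLock(SqlConnection conn)
        {
            const string sql = "DELETE MyTable (TABLOCKX) WHERE X = 1";
            using (SqlCommand cmd = new SqlCommand(sql, conn))
            {
                cmd.ExecuteNonQuery();
            }
        }
    }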



Jon Rista wrote (#6):

Hmm... I think I am more confused about the situation now than before. Why would adding a transaction to the process in QA increase the processing time by so much? Originally it took a minute; after adding the transaction it took much longer. Anyway, I'm adding SET NOCOUNT ON in each stored procedure, and I'll play with table locking and see how it helps. I don't think I've run into any deadlock situations, since the queries are all run sequentially, one after the other, and there are currently no other processes happening on the server.
