Code Project > Database & SysAdmin > Database
Is it possible to update a table concurrently? (using SQL Server 2008)

Tags: help, database, sql-server, sysadmin, question
12 Posts, 5 Posters
Ashfield (#2):

Is it essential for these bulk updates to run as a single transaction? If not, then your solution could be to make the transactions smaller.

Bob Ashfield Consultants Ltd. Proud to be a 2009 Code Project MVP

Rafferty Uy (#3):

Hi Bob, thanks for your reply. Is this the only way? The effort involved is currently high: the table has a trigger, which would somehow need to be made "smaller" as well.

Rafferty

Ashfield (#4):

It's the only way I could think of. Basically, any transaction will lock the table against other updates, so the only option is to use small transactions. I would be very reluctant to implement anything with a 30-minute transaction anyway: consider the impact of a failure after 29 minutes. The transaction would have to roll back completely before any further updates could be made, and that would probably take close to another 30 minutes. Bite the bullet and do it right; it will pay off in the long run. :)

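A common way to break a bulk update into small transactions in SQL Server 2008 is a loop over `UPDATE TOP (n)` batches, committing between batches so locks are released quickly. A minimal sketch; the `dbo.Orders` table and its columns are hypothetical stand-ins:

```sql
-- Hypothetical example: process the bulk update in small batches so
-- each transaction holds its locks only briefly.
DECLARE @BatchSize INT = 1000,
        @Rows      INT = 1;

WHILE @Rows > 0
BEGIN
    BEGIN TRANSACTION;

    UPDATE TOP (@BatchSize) dbo.Orders
    SET    Status = 'Processed'
    WHERE  Status = 'Pending';   -- the predicate shrinks each pass

    SET @Rows = @@ROWCOUNT;      -- 0 when nothing is left to update

    COMMIT TRANSACTION;          -- release locks between batches
END
```

The trade-off is that the operation as a whole is no longer atomic: a failure mid-way leaves earlier batches committed, so the `WHERE` predicate must make the loop safely restartable.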
Rafferty Uy (#1, the original question):

Hi, when a large transaction is updating thousands of rows in a table, other transactions that want to insert or update data in that table wait until the first transaction has finished (please correct me if my understanding is wrong). I'm asking in the context of a web application where multiple functions update the same table, some of which are "bulk updates" that users sometimes run on large amounts of data, taking a good 30 minutes. While they run, all other users are stuck and cannot use any function that updates that table. Is there a way to solve this problem (other than reducing the 30-minute runtime)? Thanks!

T2102 (#5):

You may already be doing this, or something more sophisticated. Depending on how you load the data, and whether you can check constraints/triggers without database access, you may be able to drop the constraints and triggers temporarily and reapply them afterwards. This can be faster, and it also lets you recover from certain errors, because you log only the rows you could not load.

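In SQL Server, triggers and check constraints can be disabled and re-enabled rather than dropped and re-created. A sketch of the idea; the table, trigger, and constraint names are hypothetical:

```sql
-- Hypothetical: temporarily disable a trigger and a check constraint
-- around a bulk load, then re-enable them afterwards.
ALTER TABLE dbo.Orders DISABLE TRIGGER trg_Orders_Audit;
ALTER TABLE dbo.Orders NOCHECK CONSTRAINT CK_Orders_Status;

-- ... perform the bulk update here ...

ALTER TABLE dbo.Orders ENABLE TRIGGER trg_Orders_Audit;
-- WITH CHECK revalidates existing rows so the constraint stays trusted.
ALTER TABLE dbo.Orders WITH CHECK CHECK CONSTRAINT CK_Orders_Status;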
David Mujica (#6):

Consider this: create an indicator of some kind that can be interrogated before making a change to the table. If the "bulk update" indicator is set, you could warn the user that the database is currently being updated and that they should retry later. Or, if you want to get sophisticated, you could create a pending-transaction table where their transaction is stored temporarily until the bulk update finishes, and then applied; essentially, you would be building a queue of updates for your table. Look at sp_getapplock as a possible tool for implementing the "bulk update" indicator. Just a thought.

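One way the sp_getapplock idea might look in practice: the bulk update holds an exclusive application lock, and interactive sessions probe the same lock with a zero timeout so they can fail fast instead of queuing. The resource name and error message here are hypothetical:

```sql
-- Bulk-update session: take an exclusive app lock for the duration.
BEGIN TRANSACTION;
EXEC sp_getapplock @Resource  = 'OrdersBulkUpdate',
                   @LockMode  = 'Exclusive',
                   @LockOwner = 'Transaction';
-- ... long-running bulk update ...
COMMIT TRANSACTION;   -- the app lock is released with the transaction

-- Interactive session: probe the lock and back off if it is held.
DECLARE @r INT;
EXEC @r = sp_getapplock @Resource    = 'OrdersBulkUpdate',
                        @LockMode    = 'Shared',
                        @LockOwner   = 'Session',
                        @LockTimeout = 0;   -- fail fast, do not wait
IF @r < 0
    RAISERROR('A bulk update is in progress; please try again later.', 16, 1);
ELSE
BEGIN
    -- ... perform the small update ...
    EXEC sp_releaseapplock @Resource = 'OrdersBulkUpdate', @LockOwner = 'Session';
END
```

sp_getapplock returns 0 or 1 on success and a negative value on timeout or failure, which is what the `IF @r < 0` check relies on.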
The Man from U N C L E (#7):

It all depends how optimistic you want to be. If you perform the updates with the (ROWLOCK) hint, it may restrict the locking, though a half-hour transaction will still end up locking everything in sight until it is done. The only other option is to run your selects with the (NOLOCK) hint, with the corresponding risk that you will read updated data that later gets rolled back and so never existed. Which is more important: stopping the locking, or getting guaranteed data? Your choice. Having said that, unless I had guaranteed single-user access to a database, such as in an overnight job, there is no way I would allow any transaction to lock records for more than a few milliseconds. In my experience, if the application hangs for half an hour waiting for database access, the end users will buy a new system.

If you have knowledge, let others light their candles at it. Margaret Fuller (1810 - 1850) [My Articles] [My Website]

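For illustration, the two hints mentioned above look like this (table and column names are hypothetical):

```sql
-- Ask for row-level locks on the update; note the engine may still
-- escalate to a table lock if enough rows are touched.
UPDATE dbo.Orders WITH (ROWLOCK)
SET    Status  = 'Processed'
WHERE  OrderId = @OrderId;

-- Dirty read: does not block behind the bulk update, but may return
-- uncommitted rows that are later rolled back.
SELECT OrderId, Status
FROM   dbo.Orders WITH (NOLOCK)
WHERE  CustomerId = @CustomerId;
```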
Rafferty Uy (#8):

Thanks, everyone, for your ideas. It looks like the best way really is to break this bulk update into smaller transactions. Let's see whether our client agrees with the change. Thanks again; you're all very helpful! :)

Rafferty Uy (#9):

Hi Bob, I'm trying to do this by implementing the smaller transactions in C#. However, I realize that I have still wrapped everything in transaction.BeginTransaction() and transaction.Commit() calls. My question is: does that do the trick? Or is it better to do some sort of "checking" at the beginning (which looks for possible errors) and then run each update in a separate transaction? This is tricky because the function I'm working on updates a tree of objects (parent-child relationships) that may contain thousands of nodes. Thanks again for your help.

Ashfield (#10):

It's hard to say without knowing more about your data structure, but if you have a tree of objects to update, then each parent item needs to be a transaction, so that it all works or it all fails and rolls back. Think of it in terms of atomic transactions, i.e. each transaction is a single unit of work, no matter how many tables are impacted.

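The one-transaction-per-parent idea could also be done server-side. A sketch under an assumed schema (a hypothetical `dbo.Nodes` table carrying a denormalized `RootId` pointer to each node's top-level parent; the real thread uses a separate relationships table, so this is only illustrative):

```sql
-- Hypothetical: commit one parent subtree per transaction instead of
-- wrapping the whole tree in a single transaction.
DECLARE @RootId INT;

DECLARE roots CURSOR LOCAL FAST_FORWARD FOR
    SELECT NodeId FROM dbo.Nodes WHERE ParentId IS NULL;

OPEN roots;
FETCH NEXT FROM roots INTO @RootId;

WHILE @@FETCH_STATUS = 0
BEGIN
    BEGIN TRANSACTION;          -- one atomic unit of work per parent

    UPDATE dbo.Nodes
    SET    Status = 'Processed'
    WHERE  RootId = @RootId;    -- the whole subtree of this parent

    COMMIT TRANSACTION;         -- locks released before the next parent

    FETCH NEXT FROM roots INTO @RootId;
END

CLOSE roots;
DEALLOCATE roots;
```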
Rafferty Uy (#11):

I see. Basically, my test data contains a single parent with children, grandchildren, and great-grandchildren (down to the 6th level), totalling 1,200 nodes. Two tables are affected: the main table, where the data lives, and another table that just holds the parent/child id relationships. Only the main table is updated. The previous code generated the tree, then traversed and updated each node using a stored procedure. I have now moved the tree generation and traversal into C# code, kept a simple stored procedure to update each row, and enclosed the entire thing in a single transaction (using the SqlTransaction class). Any comments on this? My concern is that, since it is a single transaction, I may still hit the same problem we currently have: while the data is updating, other users updating data in the same SQL table are stuck until it finishes. Thanks again.

Rafferty Uy (#12):

Hello again. I converted the large stored procedure into smaller ones by moving the tree generation and traversal business logic into C# code and using stored procedures only for simple checks and updates. The entire operation is still enclosed in a single transaction. The good news is that the 30-minute operation was optimized down to 15 minutes. The bad news is that the new C# code keeps the web server so busy that other users cannot use the application (even though multi-threading is handled automatically by IIS). What I plan to do next is implement the multi-threading for this function manually. Do you think that will solve my problem? Any other suggestions? Thanks again.
