Code Project › General Programming › C#

Multi-threading Efficiency

Tags: question, discussion
MAW30 (#1):

Can anyone please tell me the most efficient way to calculate this data? I have to calculate 10,000 items, with 1,000 algorithms applied to each. I see two options:

  1. Run 10 separate groups of 100 algorithms each in various parts of my program, locking and unlocking the DataTable each time results are stored.
  2. Calculate all 1,000 algorithms at once, locking and unlocking the DataTable only once.

Because option 2 produces a larger set of results to store, other processes might have to wait longer. However, option 1 locks and unlocks many more times, slowing down the process. What is considered the best practice or rule to follow?

Thanks in advance,
Michael
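A rough sketch of the batching idea behind option 2 (the worker count, the `Math.Sqrt` placeholder "algorithm", and the single-column table are all illustrative assumptions, not Michael's actual program): each worker computes its whole batch in memory first, then takes the table lock once per batch rather than once per item. `DataTable` is not thread-safe for writes, so the lock around `Rows.Add` is required either way.

```csharp
// Sketch only: the "algorithm" (Math.Sqrt) and the single-column table
// are placeholders, not the poster's actual code.
using System;
using System.Collections.Generic;
using System.Data;
using System.Threading.Tasks;

public class BatchingSketch
{
    public static DataTable FillBatched(int workers, int itemsPerWorker)
    {
        var table = new DataTable();
        table.Columns.Add("Result", typeof(double));
        object tableLock = new object();

        Parallel.For(0, workers, w =>
        {
            // Compute the whole batch in memory first...
            var results = new List<double>();
            for (int i = 0; i < itemsPerWorker; i++)
                results.Add(Math.Sqrt(w * itemsPerWorker + i)); // placeholder work

            // ...then take the lock once per batch, not once per item.
            lock (tableLock)
            {
                foreach (double r in results)
                    table.Rows.Add(r);
            }
        });

        return table;
    }

    public static void Main()
    {
        Console.WriteLine(FillBatched(10, 1000).Rows.Count); // 10000
    }
}
```

The key point: the lock is held only while storing results, never while computing them, so lock frequency (option 1's cost) and lock duration (option 2's cost) can both be kept small.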

Mycroft Holmes (#2), replying to MAW30:

      Given a choice I would do this in a stored procedure in the database.

      Never underestimate the power of human stupidity RAH

MAW30 (#3), replying to Mycroft Holmes:
        That's not an option.

Pete OHanlon (#4), replying to MAW30:
There's a lot of context missing here, which means we can't advise you on the best approach. Do the algorithms work on the same data out of the DataTable? Does each work on a different row? Do the effects of their calculations update the DataTable? Is the DataTable loaded once and then read-only for the lifetime of the application?

          This space for rent

Lost User (#5), replying to MAW30:

Are there any other constraints you haven't told us about? If you don't mind that you could lose all the work that has already been done, then calculate all the results in memory and write them in one go. Otherwise, you'd best work in batches.
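The batch trade-off described above can be sketched like this (the batch size, the placeholder calculation, and the `Flushes` helper are illustrative assumptions): pending results are flushed every `batchSize` items, so a crash loses at most one unwritten batch.

```csharp
// Sketch of working in batches: flush results every batchSize items so a
// crash loses at most one unwritten batch. The "algorithm" is a placeholder.
using System;
using System.Collections.Generic;

public class BatchedWriter
{
    public static int Flushes(int totalItems, int batchSize)
    {
        var pending = new List<double>();
        int flushes = 0;

        for (int i = 0; i < totalItems; i++)
        {
            pending.Add(Math.Sqrt(i));        // placeholder calculation
            if (pending.Count == batchSize)
            {
                // Here you would lock the DataTable (or open a transaction)
                // and write the whole pending batch in one go.
                pending.Clear();
                flushes++;
            }
        }
        if (pending.Count > 0) { pending.Clear(); flushes++; } // final partial batch
        return flushes;
    }

    public static void Main()
    {
        Console.WriteLine(Flushes(10000, 500)); // 20
    }
}
```

Tuning `batchSize` trades lock frequency (smaller batches mean more lock acquisitions) against how much work is at risk and how long other readers are blocked during each flush.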

Bastard Programmer from Hell :suss: If you can't read my code, try converting it here. (X-Clacks-Overhead: GNU Terry Pratchett)
