
Help with DataTable update

Matthew Cuba wrote (post #1):
Hello, I am trying to speed up the update of a DataTable. The short story is that I have a DataTable 'A' with approximately 10-15 columns of information, depending on the data loaded. A couple of extra columns get added to the table at runtime, and I need to populate those columns based on the values in another table, 'B', that has 3 columns: an ItemId column for matching the ItemId column in table A, and two other columns. I could have as many as 10K rows in DataTable A, and the number of rows in DataTable B could vary. It is also possible that DataTable B may have rows where the ItemId doesn't match any ItemId in DataTable A; in those instances, I want to ignore the rows. My current solution looks something like this:

```csharp
DataColumn aColNew1 = tableA.Columns["New1"];
DataColumn aColNew2 = tableA.Columns["New2"];
DataColumn bColNew1 = tableB.Columns["New1"];
DataColumn bColNew2 = tableB.Columns["New2"];
DataColumn bColItemId = tableB.Columns["ItemId"];

DataRow[] bRows = tableB.Select();
for (int i = 0; i < bRows.Length; ++i)
{
    // Re-scans all of table A on every iteration of the loop.
    DataRow[] aRows = tableA.Select("ItemId = '" + bRows[i][bColItemId].ToString() + "'");
    if (aRows.Length > 0)
    {
        aRows[0][aColNew1] = bRows[i][bColNew1];
        aRows[0][aColNew2] = bRows[i][bColNew2];
    }
}
```

Now, I'm pretty new to C#, so this might not be the most efficient way to do this. I hope it isn't, because it takes a considerable amount of time (close to 40 seconds) to execute when tables A and B have around 10K rows. Any suggestions on how I can speed this up would be much appreciated.

Thanks, Matt
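For reference, a common way to avoid the repeated `Select()` scans is to build a `Dictionary` index over table A's ItemId values once, then do a single pass over table B with O(1) lookups. This is a minimal sketch, not the poster's code; the table shapes and column names mirror the description above, and string ItemIds are an assumption:

```csharp
using System;
using System.Collections.Generic;
using System.Data;

static class DataTableMerge
{
    // Copy New1/New2 from tableB into matching rows of tableA, using a
    // one-time Dictionary index instead of a Select() per B row.
    public static void MergeInto(DataTable tableA, DataTable tableB)
    {
        // O(n) to build; each later lookup is O(1) instead of a table scan.
        var index = new Dictionary<string, DataRow>();
        foreach (DataRow aRow in tableA.Rows)
            index[(string)aRow["ItemId"]] = aRow;

        // Single pass over table B; ItemIds with no match in A are skipped.
        foreach (DataRow bRow in tableB.Rows)
        {
            if (index.TryGetValue((string)bRow["ItemId"], out DataRow aRow))
            {
                aRow["New1"] = bRow["New1"];
                aRow["New2"] = bRow["New2"];
            }
        }
    }

    static void Main()
    {
        var tableA = new DataTable();
        tableA.Columns.Add("ItemId", typeof(string));
        tableA.Columns.Add("New1", typeof(string));
        tableA.Columns.Add("New2", typeof(string));

        var tableB = new DataTable();
        tableB.Columns.Add("ItemId", typeof(string));
        tableB.Columns.Add("New1", typeof(string));
        tableB.Columns.Add("New2", typeof(string));

        tableA.Rows.Add("1", null, null);
        tableA.Rows.Add("2", null, null);
        tableB.Rows.Add("2", "x", "y");
        tableB.Rows.Add("9", "ignored", "ignored"); // no match in A, skipped

        MergeInto(tableA, tableB);
        Console.WriteLine(tableA.Rows[1]["New1"]); // x
    }
}
```

An alternative with the same effect is to set `tableA.PrimaryKey` to the ItemId column and use `tableA.Rows.Find(...)`, which lets ADO.NET maintain the index for you.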

Matthew Cuba replied (post #2):

      The Select() calls were killing me. It was actually faster to just select everything, sorted by ItemId and iterate through the two arrays using a while-loop nested within a for-loop, bailing out of the while whenever I got a match or when the Id I was comparing was greater than the current one in the while loop. Ended up cutting down the processing time to maybe a second or so. Should have realized how inefficient the original algorithm was. :(
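The merge-style walk described above can be sketched as follows. This is an illustrative reconstruction, not the poster's actual code: integer ItemIds, a single `New1` column, and the method name are assumptions, and `Select("", "ItemId")` stands in for "select everything, sorted by ItemId":

```csharp
using System;
using System.Data;

static class SortedMerge
{
    // Select both tables once, sorted by ItemId, then advance two cursors
    // in tandem -- no per-row Select() calls against table A.
    public static void Apply(DataTable tableA, DataTable tableB)
    {
        DataRow[] aRows = tableA.Select("", "ItemId"); // all rows, sorted ascending
        DataRow[] bRows = tableB.Select("", "ItemId");

        int a = 0;
        for (int b = 0; b < bRows.Length; b++)
        {
            int bId = (int)bRows[b]["ItemId"];
            // Advance the A cursor until its id matches or passes bId.
            while (a < aRows.Length && (int)aRows[a]["ItemId"] < bId)
                a++;
            if (a < aRows.Length && (int)aRows[a]["ItemId"] == bId)
                aRows[a]["New1"] = bRows[b]["New1"];
            // B rows with no match in A fall through and are ignored.
        }
    }

    static void Main()
    {
        var tableA = new DataTable();
        tableA.Columns.Add("ItemId", typeof(int));
        tableA.Columns.Add("New1", typeof(string));
        var tableB = new DataTable();
        tableB.Columns.Add("ItemId", typeof(int));
        tableB.Columns.Add("New1", typeof(string));

        tableA.Rows.Add(1, null);
        tableA.Rows.Add(3, null);
        tableA.Rows.Add(5, null);
        tableB.Rows.Add(4, "four"); // no match in A, ignored
        tableB.Rows.Add(3, "three");
        tableB.Rows.Add(5, "five");

        Apply(tableA, tableB);
        Console.WriteLine(tableA.Rows[1]["New1"]); // three
    }
}
```

Since both arrays are sorted, each cursor only ever moves forward, so the whole merge is one linear pass rather than 10K separate table scans.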
