Code Project — For Those Who Code
How to enhance DataSet.Merge() performance with a large number of records

C#
Tags: performance, help, tutorial, question
5 Posts · 2 Posters
  • smr85 (Offline)
    wrote on last edited by
    #1

    Hello all. I am using DataSet.Merge(table, false, MissingSchemaAction.Ignore) to merge a large number of tables (about 40). The problem appears when some tables contain a large number of records: performance becomes very bad, and merging one such table into the DataSet takes about 6-7 minutes. Does anyone know how to enhance the performance, or is there a workaround?
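    One thing worth trying before anything else: DataSet.Merge spends a lot of its time on constraint checking and index maintenance per row. A hedged sketch (the FastMerge wrapper below is my own naming, not a framework API) that defers both until the merge is done:

    ```csharp
    using System.Data;

    static void FastMerge(DataSet target, DataTable source)
    {
        bool oldEnforce = target.EnforceConstraints;
        target.EnforceConstraints = false;          // defer constraint checks until the end
        foreach (DataTable t in target.Tables)
            t.BeginLoadData();                      // suspend notifications and index maintenance
        try
        {
            target.Merge(source, false, MissingSchemaAction.Ignore);
        }
        finally
        {
            foreach (DataTable t in target.Tables)
                t.EndLoadData();                    // rebuild indexes once, not per row
            target.EnforceConstraints = oldEnforce; // may throw ConstraintException if data is invalid
        }
    }
    ```

    Note that restoring EnforceConstraints validates everything in one pass, so a ConstraintException at that point means the merged data genuinely violates a constraint.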

  • Mycroft Holmes (Offline)
    wrote on last edited by
    #2

    So you have taken a dodgy bit of technology (the DataSet, for data processing) designed at best for small-volume jobs (2-3 tables and limited rows) and beaten it to death with 40 tables and large volumes of data. I believe you should move all this processing to a database and reload the data after doing the work there. Secondly, I think your design is a disaster looking for a home; anyone loading 40 tables into one DataSet is asking for trouble. I don't know the design spec for DataSets, but I have never considered loading more than 3 (small) tables and have NEVER used .Merge! I can feel a rewrite coming your way!

      Never underestimate the power of human stupidity RAH

  • smr85 (Offline)
    wrote on last edited by
    #3

    It is a large system and we have to do it this way. Anyway, even if you have just one table containing a lot of records, the performance is not acceptable. Do you have a solution for that? Something that improves the merge performance itself, not a move to the database?

  • Mycroft Holmes (Offline)
    wrote on last edited by
    #4

    Sorry, as far as I'm concerned you are using the wrong tool for the job. This is a design issue, NOT a performance issue. If you take a small car and hook up a 5-tonne trailer, you can't complain about the performance; you need a bigger engine, and that's the database.

          Never underestimate the power of human stupidity RAH
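    For what the database-side approach could look like: bulk-copy the client rows into a staging table with SqlBulkCopy, then run one set-based MERGE on the server. Everything named below (the connection string, the "StagingOrders" and "Orders" tables, the Id/Amount columns) is a hypothetical placeholder for your own schema, not anything from this thread.

    ```csharp
    using System.Data;
    using System.Data.SqlClient;

    static string BuildMergeSql() =>
        @"MERGE Orders AS t
          USING StagingOrders AS s ON t.Id = s.Id
          WHEN MATCHED THEN UPDATE SET t.Amount = s.Amount
          WHEN NOT MATCHED THEN INSERT (Id, Amount) VALUES (s.Id, s.Amount);";

    static void MergeOnServer(DataTable source, string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            using (var bulk = new SqlBulkCopy(conn))
            {
                bulk.DestinationTableName = "StagingOrders";
                bulk.WriteToServer(source);       // minimally logged bulk insert
            }
            using (var cmd = new SqlCommand(BuildMergeSql(), conn))
                cmd.ExecuteNonQuery();            // one set-based upsert on the server
        }
    }
    ```

    The point of the split is that SqlBulkCopy streams rows far faster than per-row inserts, and the MERGE then runs as a single set operation instead of forty client-side row-by-row merges.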

  • smr85 (Offline)
    wrote on last edited by
    #5

    Thanks a lot for your reply, but the problem is that I need to do this before committing to the database: the merged data comes from the UI and from several threads, so it cannot be handled in the database. If I implement that merge function myself, do you think the performance may be different?
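    If you do hand-roll the merge, one sketch (my own, not from this thread) is to build a one-off hash index over the merge key and upsert row by row, which avoids DataSet.Merge's schema reconciliation entirely. It assumes the single column named by keyColumn uniquely identifies a row and both tables share the same column layout:

    ```csharp
    using System.Collections.Generic;
    using System.Data;

    static void UpsertRows(DataTable target, DataTable source, string keyColumn)
    {
        // One-off hash index over the target's key column: O(1) lookups
        // instead of a scan (or tree search) per source row.
        var index = new Dictionary<object, DataRow>(target.Rows.Count);
        foreach (DataRow row in target.Rows)
            index[row[keyColumn]] = row;

        foreach (DataRow src in source.Rows)
        {
            object key = src[keyColumn];
            if (index.TryGetValue(key, out DataRow existing))
                existing.ItemArray = src.ItemArray;           // update matched row
            else
                index[key] = target.Rows.Add(src.ItemArray);  // insert new row
        }
    }
    ```

    Whether this beats Merge depends on your data; it is worth measuring before committing to either.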
