Loading 500,000 rows into a DataTable (C#)

#1 chandler83
Hi, I am loading 500,000 records into a DataTable and going through each and every record. It is taking a large amount of time. Is there a more efficient way to handle this? Low memory usage is a must. Please help. Thanks. :sigh:
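For context, a minimal sketch of the pattern this post describes, assuming the file is read through the Microsoft Text ODBC driver; the connection string, folder, and file name here are hypothetical. Everything lands in one DataTable before the loop starts, which is where both the time and the memory go:

using System.Data;
using System.Data.Odbc;

class LoadAllRows
{
    static void Main()
    {
        // Hypothetical: the Text driver is pointed at a folder, and the
        // query names the file itself as the "table".
        string connStr = @"Driver={Microsoft Text Driver (*.txt; *.csv)};Dbq=C:\data\;";
        using (var conn = new OdbcConnection(connStr))
        using (var adapter = new OdbcDataAdapter("SELECT * FROM records.txt", conn))
        {
            var table = new DataTable();
            adapter.Fill(table); // pulls every row into memory before the loop starts
            foreach (DataRow row in table.Rows)
            {
                // per-record processing goes here
            }
        }
    }
}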

#2 Not Active (in reply to #1)

Where is the data being loaded from? Why are you loading so much? What do you mean by "going through each and every record"? Lots of unanswered questions here.


      only two letters away from being an asset

#3 Bassam Saoud (in reply to #1)

You can use paging to minimize the amount of data you are retrieving at once, or if you are searching, you may be able to push that processing to the SQL Server side (or whatever database you are using).
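A sketch of one way to page through an ODBC source in C#, using the DbDataAdapter.Fill(startRecord, maxRecords, ...) overload. The driver string and file name are assumptions, and note that this overload still reads and discards the skipped rows client-side, so it bounds memory rather than total read time:

using System.Data;
using System.Data.Odbc;

class PagedLoad
{
    static void Main()
    {
        // Hypothetical connection string and file name.
        string connStr = @"Driver={Microsoft Text Driver (*.txt; *.csv)};Dbq=C:\data\;";
        const int pageSize = 10000;

        using (var conn = new OdbcConnection(connStr))
        using (var adapter = new OdbcDataAdapter("SELECT * FROM records.txt", conn))
        {
            int start = 0;
            while (true)
            {
                var page = new DataTable();
                int fetched = adapter.Fill(start, pageSize, page); // at most pageSize rows held at once
                foreach (DataRow row in page.Rows)
                {
                    // process this block, then let it go out of scope
                }
                if (fetched < pageSize) break; // last page reached
                start += pageSize;
            }
        }
    }
}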

#4 chandler83 (in reply to #2)

I am making a connection to a text file through ODBC and filling the DataTable. The text file may contain anywhere from 5,000 to 500,000+ records, and I need to loop through each record and do some processing.
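Given that the source is a flat text file read over ODBC, one way to keep memory flat is to skip the DataTable entirely and stream rows with an OdbcDataReader. A sketch, with the driver string, folder, and file name as assumptions:

using System;
using System.Data.Odbc;

class StreamRows
{
    static void Main()
    {
        // Hypothetical connection string; Dbq points at the folder holding the file.
        string connStr = @"Driver={Microsoft Text Driver (*.txt; *.csv)};Dbq=C:\data\;";
        using (var conn = new OdbcConnection(connStr))
        using (var cmd = new OdbcCommand("SELECT * FROM records.txt", conn))
        {
            conn.Open();
            using (OdbcDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read()) // only one row is materialized at a time
                {
                    string first = Convert.ToString(reader.GetValue(0));
                    // do the per-record processing here instead of after a Fill
                }
            }
        }
    }
}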

#5 Not Active (in reply to #4)

chandler83 wrote: "filling the datatable"

Do you mean the table in the database or a DataTable object?


            only two letters away from being an asset

#6 V 0 (in reply to #1)

I once had to update/insert a couple of thousand records. What I did was concatenate the INSERT statements (delimited with ";") and fire them at the database in one go, e.g. "INSERT INTO YOURTABLE (COLNAMES) VALUES (COLVALUES); INSERT INTO YOURTABLE (COLNAMES) VALUES (COLVALUES); ...; INSERT INTO YOURTABLE (COLNAMES) VALUES (COLVALUES);". In your case, don't do it all at once, but in blocks. Hope this helps.
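A sketch of that blocking approach in C#, assuming the target is SQL Server via System.Data.SqlClient. YOURTABLE, COL1, COL2, and the connection string are placeholders carried over from the post, and string-built SQL like this needs escaping (or, better, parameters) if the data is untrusted:

using System;
using System.Data.SqlClient;
using System.Text;

class BatchedInserts
{
    // rows: each element holds the column values for one record.
    static void InsertInBlocks(string[][] rows, string connStr, int blockSize)
    {
        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();
            for (int i = 0; i < rows.Length; i += blockSize)
            {
                var sql = new StringBuilder();
                int end = Math.Min(i + blockSize, rows.Length);
                for (int j = i; j < end; j++)
                {
                    // Escape single quotes; for untrusted input, prefer parameters.
                    sql.AppendFormat(
                        "INSERT INTO YOURTABLE (COL1, COL2) VALUES ('{0}', '{1}');",
                        rows[j][0].Replace("'", "''"),
                        rows[j][1].Replace("'", "''"));
                }
                using (var cmd = new SqlCommand(sql.ToString(), conn))
                {
                    cmd.ExecuteNonQuery(); // one round trip per block instead of per row
                }
            }
        }
    }
}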

              V. No hurries, no worries

#7 Christian Graus (in reply to #4)

You should prepare your data (do the loop), then push it to the DB all at once, or at least in batches. One DB call per record will indeed take forever.
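If the destination is SQL Server, SqlBulkCopy is the standard tool for exactly this "all at once, in batches" pattern, though the thread itself doesn't name it. A sketch, where the connection string and destination table name are assumptions and the prepared DataTable's column names are expected to match the destination's (otherwise add ColumnMappings):

using System.Data;
using System.Data.SqlClient;

class BulkLoad
{
    static void Upload(DataTable prepared, string connStr)
    {
        using (var bulk = new SqlBulkCopy(connStr))
        {
            bulk.DestinationTableName = "dbo.Records"; // hypothetical table name
            bulk.BatchSize = 10000;                    // rows per batch sent to the server
            bulk.WriteToServer(prepared);
        }
    }
}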

                Christian Graus - C++ MVP 'Why don't we jump on a fad that hasn't already been widely discredited ?' - Dilbert
