Loading 500,000 rows into a DataTable
-
Hi, I am loading 500,000 records into a DataTable and going through each and every record. It is taking a large amount of time. Is there a more efficient way to handle this? Low memory usage is a must. Please help. Thanks. :sigh:
-
chandler83 wrote:
Hi, I am loading 500,000 records into a DataTable and going through each and every record. It is taking a large amount of time. Is there a more efficient way to handle this? Low memory usage is a must. Please help. Thanks. :sigh:
Where is the data being loaded from? Why are you loading so much? What do you mean by "going through each and every record"? Lots of unanswered questions here.
only two letters away from being an asset
-
chandler83 wrote:
Hi, I am loading 500,000 records into a DataTable and going through each and every record. It is taking a large amount of time. Is there a more efficient way to handle this? Low memory usage is a must. Please help. Thanks. :sigh:
You can use paging to minimize the amount of data you are retrieving at any one time, or if you are searching, you could push that processing to the SQL Server side (or whatever database you are using).
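A minimal sketch of the paging idea in C#, assuming an ODBC text-file source; the connection string, file name, and page size are placeholders. DbDataAdapter's Fill(startRecord, maxRecords, table) overload bounds how many rows sit in the DataTable at once, though note it still reads past the skipped rows rather than skipping them at the source:

using System;
using System.Data;
using System.Data.Odbc;

class PagingExample
{
    static void Main()
    {
        // Hypothetical connection string for the Microsoft Text Driver;
        // Dbq points at the folder holding the text file.
        string connStr = "Driver={Microsoft Text Driver (*.txt; *.csv)};Dbq=C:\\data;";
        const int pageSize = 10000;

        using (OdbcConnection conn = new OdbcConnection(connStr))
        using (OdbcDataAdapter adapter = new OdbcDataAdapter("SELECT * FROM data.txt", conn))
        {
            int start = 0;
            while (true)
            {
                DataTable page = new DataTable();
                // Loads at most pageSize rows, starting at row 'start'.
                int rows = adapter.Fill(start, pageSize, page);
                if (rows == 0)
                    break;

                foreach (DataRow row in page.Rows)
                {
                    // ... per-record processing goes here ...
                }

                start += rows;
                page.Dispose(); // free the current page before loading the next
            }
        }
    }
}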
-
Where is the data being loaded from? Why are you loading so much? What do you mean by "going through each and every record"? Lots of unanswered questions here.
only two letters away from being an asset
I am making a connection to a text file through ODBC and filling the DataTable. The text file may contain anywhere from 5,000 to 500,000+ records, and I need to loop through each record and do some processing.
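If each record only needs to be visited once, a DataTable may not be needed at all: a forward-only data reader keeps just the current record in memory. A sketch of the same loop with OdbcDataReader (the connection string and file name are again placeholders):

using System;
using System.Data.Odbc;

class ReaderExample
{
    static void Main()
    {
        string connStr = "Driver={Microsoft Text Driver (*.txt; *.csv)};Dbq=C:\\data;";
        using (OdbcConnection conn = new OdbcConnection(connStr))
        {
            conn.Open();
            using (OdbcCommand cmd = new OdbcCommand("SELECT * FROM data.txt", conn))
            using (OdbcDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Only the current row is buffered; 500,000 rows
                    // never sit in memory at once.
                    object firstColumn = reader.GetValue(0);
                    // ... per-record processing goes here ...
                }
            }
        }
    }
}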
-
chandler83 wrote:
I am making a connection to a text file through ODBC and filling the DataTable. The text file may contain anywhere from 5,000 to 500,000+ records, and I need to loop through each record and do some processing.
chandler83 wrote:
filling the DataTable
Do you mean the table in the database, or a DataTable object?
only two letters away from being an asset
-
chandler83 wrote:
Hi, I am loading 500,000 records into a DataTable and going through each and every record. It is taking a large amount of time. Is there a more efficient way to handle this? Low memory usage is a must. Please help. Thanks. :sigh:
I had to update/insert a couple of thousand records once. What I did was concatenate the INSERT statements (delimited with ";") and send them to the database in one go, e.g.
"INSERT INTO YOURTABLE (COLNAMES) VALUES (COLVALUES);INSERT INTO YOURTABLE (COLNAMES) VALUES (COLVALUES);...;INSERT INTO YOURTABLE (COLNAMES) VALUES (COLVALUES);"
In your case, don't do it all at once, but in blocks. Hope this helps.
V. No hurries, no worries
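A rough sketch of that block-wise approach in C#, assuming a SQL Server target and an already-filled DataTable. YOURTABLE and the column names follow the placeholders above, and the quote-doubling here is a simplification; real code should escape or parameterize values properly:

using System.Data;
using System.Data.SqlClient;
using System.Text;

class BatchInsertExample
{
    // Sends the rows of a filled DataTable as concatenated INSERT
    // statements, 'blockSize' statements per round trip.
    // Assumes 'conn' is already open.
    static void InsertInBlocks(DataTable table, SqlConnection conn, int blockSize)
    {
        StringBuilder sql = new StringBuilder();
        int pending = 0;

        foreach (DataRow row in table.Rows)
        {
            sql.AppendFormat(
                "INSERT INTO YOURTABLE (COL1, COL2) VALUES ('{0}', '{1}');",
                row[0].ToString().Replace("'", "''"),
                row[1].ToString().Replace("'", "''"));

            if (++pending == blockSize)
            {
                using (SqlCommand cmd = new SqlCommand(sql.ToString(), conn))
                    cmd.ExecuteNonQuery();
                sql.Length = 0; // reset the buffer for the next block
                pending = 0;
            }
        }

        if (pending > 0) // flush the last, partial block
        {
            using (SqlCommand cmd = new SqlCommand(sql.ToString(), conn))
                cmd.ExecuteNonQuery();
        }
    }
}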
-
chandler83 wrote:
I am making a connection to a text file through ODBC and filling the DataTable. The text file may contain anywhere from 5,000 to 500,000+ records, and I need to loop through each record and do some processing.
You should prepare your data (do the loop) first, then send it to the DB all at once, or at least in batches; one DB call per record will indeed take forever. (A bulk-copy sketch follows this post.)
Christian Graus - C++ MVP 'Why don't we jump on a fad that hasn't already been widely discredited ?' - Dilbert
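One technique not mentioned in the thread, but worth knowing if the destination really is SQL Server: SqlBulkCopy is purpose-built for pushing a prepared DataTable in bulk and avoids per-row round trips entirely. A minimal sketch; the connection string and "dbo.YourTable" are placeholders:

using System.Data;
using System.Data.SqlClient;

class BulkCopyExample
{
    static void BulkLoad(DataTable prepared, string connStr)
    {
        using (SqlConnection conn = new SqlConnection(connStr))
        {
            conn.Open();
            using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
            {
                bulk.DestinationTableName = "dbo.YourTable"; // placeholder
                bulk.BatchSize = 10000; // commit every 10,000 rows
                bulk.WriteToServer(prepared); // streams the whole table
            }
        }
    }
}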