Performance issue
-
Hi all, in my project I am reading a binary file of 81 MB, and the information needs to be inserted into different tables of a database. The binary file has several sections and contains some millions of records. While reading the binary data I convert the bytes into C# data types and save them in an ArrayList, and finally I insert them into the database, which takes a huge amount of time: inserting this 81 MB file into the database takes my application 30 minutes. Can anyone suggest how I can optimize the performance? Thanks in advance.
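Roughly, my loading code looks like the simplified sketch below. The record layout, table name and column names are just placeholders; the real file has several sections with different record types.

using System;
using System.Collections;
using System.Data.SqlClient;
using System.IO;

class Loader
{
    static void LoadFile(string filePath, string connectionString)
    {
        var records = new ArrayList();

        // Read the binary file and convert the bytes into C# types.
        using (var reader = new BinaryReader(File.OpenRead(filePath)))
        {
            while (reader.BaseStream.Position < reader.BaseStream.Length)
            {
                int id = reader.ReadInt32();        // placeholder record layout
                double value = reader.ReadDouble();
                records.Add(new object[] { id, value });
            }
        }

        // One INSERT per record; this is the part that takes ~30 minutes.
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            foreach (object[] record in records)
            {
                using (var cmd = new SqlCommand(
                    "INSERT INTO SectionData (Id, Value) VALUES (@id, @value)", connection))
                {
                    cmd.Parameters.AddWithValue("@id", record[0]);
                    cmd.Parameters.AddWithValue("@value", record[1]);
                    cmd.ExecuteNonQuery();
                }
            }
        }
    }
}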
-
No issue here.
-
Sorry, the response I gave previously is wrong; please ignore it. :laugh::laugh: satish......
-
Hard to say. There are a lot of ways to boost performance.
- Is the algorithm optimized? Are your for loops OK? (E.g., don't perform actions inside for loops that could be done outside.)
- Do you have GUI updates, like a progress bar or something?
- Try an Ngen compilation; it might help (see MSDN).
- Can you use threads? E.g., read in a block of bytes and, while the DB updates, read the next block at the same time, etc. (see the sketch after this list).
- Did you use optimized objects? (E.g., StringBuilder for string concatenation.)
Hope this helps...
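For the threading point, here is a minimal sketch of overlapping the file reading with the database writes using a producer/consumer queue. Record, ReadNextBlock and WriteBlockToDatabase are placeholders for your own file format and DB code; the snippet goes inside your load method and needs System.Collections.Concurrent, System.IO and System.Threading.Tasks.

// Bounded queue so the reader cannot run too far ahead of the writer.
var blocks = new BlockingCollection<Record[]>(boundedCapacity: 4);

var readerTask = Task.Run(() =>
{
    using (var reader = new BinaryReader(File.OpenRead(filePath)))
    {
        Record[] block;
        while ((block = ReadNextBlock(reader)) != null)   // placeholder: read one block of records
            blocks.Add(block);                            // hand the block to the writer
    }
    blocks.CompleteAdding();                              // signal end of file
});

var writerTask = Task.Run(() =>
{
    foreach (var block in blocks.GetConsumingEnumerable())
        WriteBlockToDatabase(block);                      // placeholder: e.g. one bulk insert per block
});

Task.WaitAll(readerTask, writerTask);

While the writer is busy inserting one block, the reader is already converting the next one, so the file I/O and the database work overlap instead of running one after the other.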
V. If I don't see you in this world, I'll see you in the next one... And don't be late. (Jimi Hendrix)
-
Hi, you can use the following tips/suggestions/whatever :)
1. Don't use INSERT when "loading" the data; use BULK INSERT.
2. You can convert the data into its C# representation, dump it as required into a temporary file, and bulk-insert the data from that file (a sketch follows below). If you are good at threading, you can boost the performance considerably.
3. If you are comfortable with dynamic SQL, use it after converting the data to C# data types.
If you are good at both threading and dynamic SQL, you can get a performance boost like "a complete load of 45 gigabytes of data into 12 tables in 1.5 hours". That is what I experienced once! Hope that much performance is enough for now and you find it useful! :)
Regards,
Adeel
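For point 2, a rough sketch of the dump-then-BULK INSERT approach is below (needs System.IO and System.Data.SqlClient). The table name, file path and delimiter are made up, and note that the path in BULK INSERT must be readable from the SQL Server machine. If the file and the server are on different machines, SqlBulkCopy in ADO.NET does the same job directly from memory, without the temporary file.

// 1. Dump the converted records to a delimited temporary file.
using (var writer = new StreamWriter(@"C:\Temp\section1.dat"))
{
    foreach (object[] record in records)                  // records = your converted data
        writer.WriteLine("{0}|{1}", record[0], record[1]);
}

// 2. Load the whole file in one shot with BULK INSERT.
using (var connection = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(
    @"BULK INSERT dbo.SectionData
      FROM 'C:\Temp\section1.dat'
      WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n', TABLOCK)", connection))
{
    connection.Open();
    cmd.CommandTimeout = 0;   // millions of rows can take a while
    cmd.ExecuteNonQuery();
}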
Do rate the reply, whether it helps or not, because it helps the members to know what solved the issue. Thanks.
-
Just for information, are you developing a data warehouse?
Do rate the reply, whether it helps or not, because it helps the members to know what solved the issue. Thanks.