fastest method
-
I have 25 text files, each with half a million records. My job is to read each text file, create a database, and write all the records on the fly. What is the fastest method, so that I can write to the db quickly?
“You will never be a leader unless you first learn to follow and be led.” –Tiorio "Coming together is a beginning, staying together is progress, and working together is success." Henry Ford
-
Use streams. I usually prefer StreamReader. DON'T use the static methods of the File class (such as File.ReadAllLines) for files this size; they read the entire file into memory at once and can be dramatically slower than streaming. To verify that, just write a small benchmark app using a System.Diagnostics.Stopwatch to test the speed of the various methods.
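A minimal sketch of the kind of Stopwatch benchmark suggested above; the file name and the idea of counting lines are placeholders for your actual parse-and-insert work:

```csharp
using System;
using System.Diagnostics;
using System.IO;

class ReadBenchmark
{
    static void Main()
    {
        // Placeholder for one of the 25 input files.
        const string path = "records01.txt";

        var sw = Stopwatch.StartNew();
        long count = 0;
        using (var reader = new StreamReader(path))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                count++; // parsing and the database write would happen here
            }
        }
        sw.Stop();
        Console.WriteLine("{0} lines in {1} ms", count, sw.ElapsedMilliseconds);
    }
}
```

Run the same loop against File.ReadAllLines and compare the timings for your own files before committing to either approach.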
-
It will stay pretty slow, but what you could do is: if a query fails, log it. That way you can run those manually if necessary (after finding the error ;-)).
V.
Stop smoking so you can: Enjoy longer the money you save. Moviereview Archive -
N a v a n e e t h wrote:
Use DTS in SQL server
Doesn't exist any more. Its successor is called SSIS (SQL Server Integration Services).
Thanks. Good to learn from you :)
All C# applications should call Application.Quit(); in the beginning to avoid any .NET problems.- Unclyclopedia How to use google | Ask smart questions
-
If these files have an identifiable structure, and don't require lots of complicated transformations, you could always bulk import them using bcp (I'm taking a flyer here that you are using SQL Server) and a format file to constrain what goes where.
Deja View - the feeling that you've seen this post before.
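For the record, a bcp invocation along those lines might look like this; the database, table, file, format file, and server names are all placeholders:

```shell
# Bulk-load one text file into a SQL Server table using a format file.
# -f names the format file, -S the server, -T uses a trusted connection.
bcp MyDb.dbo.RawRecords in records01.txt -f records.fmt -S myserver -T
```

The format file maps each field in the text file to a column in the target table, so fields can be skipped or reordered without touching the data.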
-
-
Is SqlBulkCopy a feasible solution?
Depending on the format of your files, yes. If your file doesn't have the data you need split over multiple lines, then it's extremely feasible.
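A sketch of what SqlBulkCopy usage could look like here, assuming a single all-varchar staging column; the connection string, file name, and table name are placeholders:

```csharp
using System.Data;
using System.Data.SqlClient;
using System.IO;

class BulkLoader
{
    static void Main()
    {
        // All names below are placeholders for your environment.
        const string connectionString = "Server=.;Database=MyDb;Integrated Security=true";
        const string path = "records01.txt";

        // One raw varchar column keeps the sketch simple; in practice you
        // would add one column per field in the text file.
        var table = new DataTable();
        table.Columns.Add("RawLine", typeof(string));

        using (var reader = new StreamReader(path))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
                table.Rows.Add(line);
        }

        using (var bulk = new SqlBulkCopy(connectionString))
        {
            bulk.DestinationTableName = "dbo.RawRecords";
            bulk.BatchSize = 10000; // commit in chunks rather than one huge batch
            bulk.WriteToServer(table);
        }
    }
}
```

Note that buffering half a million rows in a DataTable is memory-hungry; WriteToServer also accepts an IDataReader, which lets you stream rows straight from the file.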
-
We have this exact situation and use a combination of SqlBulkCopy for the FoxPro files and bcp for the text files. As Pete said, you must have a consistent file structure. We have one situation where the file structure changes; in that case we read the first line of the text file and pass that string to a stored procedure, which drops and recreates the target table if the structure has changed. We then use SqlBulkCopy to load the table. Caveat: make ALL your target fields varchar and do your transforms using a stored procedure called AFTER the bulk copy.
Never underestimate the power of human stupidity RAH
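The workflow described above (read the header line, let a stored procedure rebuild the staging table, bulk copy, then transform) might be wired together like this; the two stored procedure names are hypothetical, as are the connection string and file name:

```csharp
using System.Data;
using System.Data.SqlClient;
using System.IO;

class StagingLoad
{
    static void Main()
    {
        // Placeholders for your environment.
        const string connectionString = "Server=.;Database=MyDb;Integrated Security=true";
        const string path = "records01.txt";

        // Read the header line that describes this file's structure.
        string header;
        using (var reader = new StreamReader(path))
            header = reader.ReadLine();

        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // Hypothetical proc that drops and recreates the all-varchar
            // staging table when the header has changed.
            using (var cmd = new SqlCommand("dbo.EnsureStagingTable", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.Parameters.AddWithValue("@header", header);
                cmd.ExecuteNonQuery();
            }

            // ... SqlBulkCopy into the staging table goes here ...

            // Hypothetical proc that converts the varchar columns to typed
            // columns in the real table AFTER the bulk copy, per the caveat above.
            using (var cmd = new SqlCommand("dbo.TransformStagedRecords", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.ExecuteNonQuery();
            }
        }
    }
}
```

Doing the type conversion in the stored procedure rather than during the load means a single bad value rejects cleanly in SQL rather than aborting the whole bulk copy.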
-
+1 to bulk copy and format files.
-
Mycroft Holmes wrote:
make ALL your target fields varchar and do your transforms using the stored procedure called AFTER the bulkcopy
Pretty much what I do too. I have a "raw" version of the table that's all varchar and a trigger to convert and move the data on to where it needs to be.
-
PIEBALDconsult wrote:
and a trigger to convert and move the data
You will not find a single trigger in any database I have ever been responsible for; I hate the bloody things. I remember the first time I ran across them: I spent two days trying to find out why a process failed when I wasn't anywhere near the table that caused the error - bloody triggers :mad:
-
I agree, but they seem a reasonable solution to the loading-via-bcp situation: A one-line BAT file is all that's required to perform the load. The whole operation is performed in a transaction, so if the data is invalid I'm not left with invalid data in the "raw" table.