Adventures in Async
-
Never bothered with async programming before since I never needed it. But now I'm having to take care of a weekly delivery of an 80 GB (eighty gigabyte) XML file. Parsing it and saving 10 million records to 30 different tables in a database takes more than an hour, and there's no simple optimization left to do. But I'm only using one core of the processor, so let's go parallel, it'll be fun learning. Right? The easiest part is bulk copying to the database in parallel. Easy enough, but it only shaves five minutes off the total time. That's not where the biggest bottleneck is. The biggest bottleneck is the actual parsing of the XML. I don't want to rework the whole application to use locks and thread-safe collections, so I decided to split the work vertically instead: add a task for every collection of data. Also easy enough, and now the processor is working close to 100% - but it takes twice as long. :wtf: Apparently the creation of tasks has more overhead than the parsing of the data itself. :laugh: No shortcuts for me today. Back to the drawing board.
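For the curious, here's the shape I'm aiming for next, sketched in Python for brevity (the real application is .NET, and every name here - `parse_records`, `save_batch`, the `rec` elements - is invented for illustration): stream-parse with a pull parser so the 80 GB never has to fit in memory, and hand workers *batches* of records instead of one task per record, so the task-creation overhead is amortized.

```python
# Illustrative sketch: streaming parse + batched parallel saves.
import io
import xml.etree.ElementTree as ET
from concurrent.futures import ThreadPoolExecutor

# Tiny stand-in for the 80 GB delivery.
SAMPLE = "<root>" + "".join(f"<rec id='{i}'/>" for i in range(10)) + "</root>"

def parse_records(stream):
    """Yield records one at a time, freeing each element as we go."""
    for event, elem in ET.iterparse(stream, events=("end",)):
        if elem.tag == "rec":
            yield elem.attrib["id"]
            elem.clear()  # keep memory flat on huge files

def batches(iterable, size):
    """Group a stream of records into lists of up to `size` items."""
    batch = []
    for item in iterable:
        batch.append(item)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

def save_batch(batch):
    # Stand-in for the bulk copy to the database; returns rows "written".
    return len(batch)

with ThreadPoolExecutor(max_workers=4) as pool:
    written = sum(pool.map(save_batch,
                           batches(parse_records(io.StringIO(SAMPLE)), 4)))
print(written)  # 10
```

The point of the batching is that the task overhead is paid once per batch instead of once per record; batch size becomes a tuning knob.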
Wrong is evil and must be defeated. - Jeff Ello
Welcome to the cool club though. Ladies can't resist an async coder. #science
Jeremy Falcon
-
More proof that some people have real problems. So stop complaining people, you could be Jörgen today.
Ron Anders wrote:
So stop complaining people, you could be Jörgen today.
...and have no toilet paper.
Jeremy Falcon
-
Why are you even parsing XML files - and 80 GB ones at that - and then saving them to the database?! You could try the SQL Server bulk import tools and avoid programming such stuff altogether...
Caveat Emptor. "Progress doesn't come from early risers – progress is made by lazy men looking for easier ways to do things." Lazarus Long
Because I want to have the data extracted into normalized tables.
Wrong is evil and must be defeated. - Jeff Ello
-
Welcome to the cool club though. Ladies can't resist an async coder. #science
Jeremy Falcon
That's seriously the best answer today. :-D
Wrong is evil and must be defeated. - Jeff Ello
-
That's seriously the best answer today. :-D
Wrong is evil and must be defeated. - Jeff Ello
:-D
Jeremy Falcon
-
More proof that some people have real problems. So stop complaining people, you could be Jörgen today.
Isn't it enough if I'm being me?
Wrong is evil and must be defeated. - Jeff Ello
-
Because I want to have the data extracted into normalized tables.
Wrong is evil and must be defeated. - Jeff Ello
If you have SQL Server there is SSIS anyway... [Importing XML documents using SQL Server Integration Services](https://www.mssqltips.com/sqlservertip/3141/importing-xml-documents-using-sql-server-integration-services/)
Caveat Emptor. "Progress doesn't come from early risers – progress is made by lazy men looking for easier ways to do things." Lazarus Long
-
if u have sql server there is SSIS anyway... [Importing XML documents using SQL Server Integration Services](https://www.mssqltips.com/sqlservertip/3141/importing-xml-documents-using-sql-server-integration-services/)
Caveat Emptor. "Progress doesn't come from early risers – progress is made by lazy men looking for easier ways to do things." Lazarus Long
I've missed out on that possibility completely. A bit late now, but I'll take a look at it anyway. :thumbsup:
Wrong is evil and must be defeated. - Jeff Ello
-
if u have sql server there is SSIS anyway... [Importing XML documents using SQL Server Integration Services](https://www.mssqltips.com/sqlservertip/3141/importing-xml-documents-using-sql-server-integration-services/)
Caveat Emptor. "Progress doesn't come from early risers – progress is made by lazy men looking for easier ways to do things." Lazarus Long
I think I see the reason why I missed out on that possibility: XMLSource does not seem to exist on SQL Server 2012.
Wrong is evil and must be defeated. - Jeff Ello
-
Oh yes, as soon as your thread count exceeds the core count, you are going to get some slowdown. You need to be aware that threading is not a "magic bullet" that will solve all your performance woes at a stroke - it needs to be carefully thought about and planned, or it can do two things:
1) Slow your machine to a crawl, and make your application considerably slower than it started out.
2) Crash or lock up your app completely.
The reasons why are simple:
1) Threads require two things to run: memory and a free core. The memory will be at the very least the size of a system stack in your language (usually around 1 MB for Windows, 8 MB for Linux), plus some overhead for the thread itself, and yet more for any memory-based objects each thread creates; and a thread can only run when a core becomes available. If you generate more threads than you have cores, then most of them will spend a lot of time sitting waiting for a core to become available. The more threads you generate, the worse the problems become: more threads put more load on the system to switch threads more often, and that takes core time as well. All threads in the system from all processes share the cores in the machine, so other apps and system threads also need their time to run. Add too many, and the system will spend more and more of its time trying to work out which thread to run, and performance degrades. Generate enough threads to exceed the physical memory in your computer and performance suddenly takes an enormous hit as the virtual memory system kicks in and starts thrashing memory pages to the HDD.
2) Multiple threads within a process have to be thread safe because they share memory and other resources - which means that several things can happen:
2a) If two threads need the same resources, you can easily end up in a situation where thread A has locked resource X and wants Y, while thread B has locked resource Y and wants X. At this point a "deadly embrace" has occurred, and neither thread (nor any other that needs X or Y) can ever run again.
2b) If your code isn't thread safe, then different threads can try to read and/or alter the same memory at the same time: this often happens when trying to add or remove items from a collection. At this point strange things start to happen, up to and including your app crashing.
2c) If resources have a finite capacity - like the bandwidth on an internet connection, for example - then bad threading can easily use it all, at either end of the link. If you run out of capacity, your threads will stall.
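A minimal sketch of the standard cure for the 2a) scenario, in Python for brevity (the names `lock_x`, `lock_y` and `worker` are invented for illustration): make every thread acquire the locks in the same global order, so no cycle of waiters can ever form.

```python
# Illustrative sketch: consistent lock ordering prevents the deadly embrace.
import threading

lock_x = threading.Lock()
lock_y = threading.Lock()
results = []

def worker(name):
    # Both threads take lock_x before lock_y. If one of them took the
    # locks in the opposite order, thread A could hold X while waiting
    # for Y as thread B holds Y waiting for X - and neither would ever
    # run again.
    with lock_x:
        with lock_y:
            results.append(name)

threads = [threading.Thread(target=worker, args=(n,)) for n in ("A", "B")]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(sorted(results))  # both workers completed: ['A', 'B']
```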
OriginalGriff wrote:
if you have a very large bus it is a slow way to get from A to B, but when you average it out over the large number of passengers it's pretty quick.
...something-something-bandwidth-of-a-station-wagon-carrying-storage-medium...
OriginalGriff wrote:
except you are putting a lot more vehicles on the same roads which means more chance of traffic jams, accidents, breakdowns, and so forth. Put too many on the same roads and they get blocked up with cars and nobody can move anywhere because there is a car in their way ...
There's a [meme](https://starecat.com/multithreaded-programming-theory-vs-actual-puppies-eating-from-bowls-mess/) for that...
-
I think I see the reason why I missed out on that possibility, it does not seem to exist on SQL Server 2012.
Wrong is evil and must be defeated. - Jeff Ello
If it's the dev environment you have, you can run the SQL Server setup and select the components needed to get the SSIS services and the VS-based client tools; it's on the ISO or DVD etc. Also, there is OPENROWSET: [Simple way to Import XML Data into SQL Server with T-SQL](https://www.mssqltips.com/sqlservertip/5707/simple-way-to-import-xml-data-into-sql-server-with-tsql/)
Caveat Emptor. "Progress doesn't come from early risers – progress is made by lazy men looking for easier ways to do things." Lazarus Long
-
Jörgen Andersson wrote:
netizens of the lounge to have a laugh on my behalf.
We wouldn't do that! :laugh:
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony AntiTwitter: @DalekDave is now a follower!
No. Programming is hard.
Recursion is for programmers who haven't blown enough stacks yet.
-
Never bothered with Async programming before since I never needed it. [...] Apparently the creation of tasks has more overhead than the parsing of the data itself. :laugh: No shortcuts for me today. Back to the drawing board.
I can't help, but reading "parse" in the body of the message... This is clearly a case for... HONEY THE @CODE-WITCH tatatataaaaaaa :laugh: :laugh: :laugh: :laugh:
M.D.V. ;) If something has a solution... Why do we have to worry about?. If it has no solution... For what reason do we have to worry about? Help me to understand what I'm saying, and I'll explain it better to you Rating helpful answers is nice, but saying thanks can be even nicer.
-
Never bothered with Async programming before since I never needed it. [...] Apparently the creation of tasks has more overhead than the parsing of the data itself. :laugh: No shortcuts for me today. Back to the drawing board.
Are you sure the bottleneck isn't disk I/O?
Real programmers use butterflies
-
Never bothered with Async programming before since I never needed it. [...] Apparently the creation of tasks has more overhead than the parsing of the data itself. :laugh: No shortcuts for me today. Back to the drawing board.
-
40 GB of your 80 GB XML file are tags. So much for the overhead. My suggestion: drop all markup languages worldwide (XML, JSON and similar shiite).