How to de-batch a very large XMLDocument into bite-size chunks

ASP.NET · xml, tutorial · 4 Posts · 3 Posters
ONeil Tomlinson · #1

I have a very large XMLDocument with 56,000 child records, each having 8 child field elements. The application I have written performs a complicated process on the XML. It can process 10,000 records in 5 minutes and 20,000 in 25 minutes (the time grows exponentially), but takes maybe 10 hours to process all 56,000 records, which is unacceptable. What's the best way to de-batch this XMLDocument into bite-size chunks (e.g. 10,000 at a time, so that after 5 iterations it will have processed 50,000 records in 25-30 minutes of total processing time)? Thanks.
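One way to do the splitting without loading the whole document at once is to stream it with XmlReader and cut it into fixed-size chunks. A minimal sketch, assuming the file is named records.xml and the repeating element is named Record (both placeholders), with ProcessBatch standing in for the existing per-record work:

using System;
using System.Collections.Generic;
using System.Xml;

class Batcher
{
    const int BatchSize = 10000;

    static void Main()
    {
        List<string> batch = new List<string>(BatchSize);
        using (XmlReader reader = XmlReader.Create("records.xml"))
        {
            reader.MoveToContent(); // position on the root element
            while (!reader.EOF)
            {
                if (reader.NodeType == XmlNodeType.Element && reader.Name == "Record")
                {
                    // ReadOuterXml consumes the element and leaves the reader
                    // on the following node, so no extra Read() is needed here.
                    batch.Add(reader.ReadOuterXml());
                    if (batch.Count == BatchSize)
                    {
                        ProcessBatch(batch);
                        batch.Clear();
                    }
                }
                else
                {
                    reader.Read();
                }
            }
        }
        if (batch.Count > 0)
            ProcessBatch(batch); // final partial chunk
    }

    // Placeholder: rebuild a small XmlDocument per chunk and run the
    // existing processing against it.
    static void ProcessBatch(List<string> records)
    {
        XmlDocument doc = new XmlDocument();
        doc.LoadXml("<Records>" + String.Join("", records.ToArray()) + "</Records>");
        Console.WriteLine("Processing {0} records...", doc.DocumentElement.ChildNodes.Count);
    }
}

Because each chunk is processed and discarded, the working set stays small and the total time should stay roughly proportional to the number of chunks instead of blowing up with document size.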

Not Active · #2 · in reply to ONeil Tomlinson

Have you tried any multi-threading? I'd also revisit the algorithm, or the work being done; it seems excessive for it to take that much time.


      I know the language. I've read a book. - _Madmatt
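A rough sketch of the multi-threading suggestion, assuming the document has already been split into chunks (e.g. by the XmlReader sketch above); LoadBatches and the per-record work are hypothetical placeholders:

using System;
using System.Threading;

class ParallelBatches
{
    static int pending;
    static ManualResetEvent allDone = new ManualResetEvent(false);

    static void Main()
    {
        // Hypothetical helper: returns the records already grouped into chunks.
        string[][] batches = LoadBatches();
        pending = batches.Length;
        if (pending == 0)
            return; // nothing to do

        foreach (string[] batch in batches)
            ThreadPool.QueueUserWorkItem(ProcessBatch, batch);

        allDone.WaitOne(); // block until every batch has finished
    }

    static void ProcessBatch(object state)
    {
        string[] records = (string[])state;
        foreach (string record in records)
        {
            // ... the existing per-record work goes here ...
        }
        if (Interlocked.Decrement(ref pending) == 0)
            allDone.Set(); // last batch done
    }

    static string[][] LoadBatches()
    {
        return new string[0][]; // placeholder
    }
}

Threads only pay off if the per-batch work is CPU-bound and the batches are independent; any shared state would need locking.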

Rutvik Dave · #3 · in reply to ONeil Tomlinson

You can try these three approaches (a sketch of 2 and 3 follows below):

1. As Mark suggested, use multithreading.
2. Load the XML document into a DataTable and perform the task there; it will be much faster than processing a file on disk.
3. If possible, create a global temp table in SQL Server (##tablename, so that you don't lose the table when the connection closes), insert the XML document into the table, and then fire SELECT / UPDATE statements against it to get the desired result.

You can use all three at the same time for improved performance.

        modified on Sunday, February 28, 2010 6:39 PM
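A hedged sketch of suggestions 2 and 3: DataSet.ReadXml infers a table from the repeating record element, and SqlBulkCopy is one way to do the bulk insert (the post doesn't name a mechanism). The file name, connection string, and column definitions are assumptions; the real record has 8 field elements, so the temp table would need the 8 matching columns.

using System;
using System.Data;
using System.Data.SqlClient;

class XmlToTable
{
    static void Main()
    {
        // Suggestion 2: load the XML into an in-memory DataTable.
        DataSet ds = new DataSet();
        ds.ReadXml("records.xml"); // assumed file name
        DataTable records = ds.Tables[0];
        Console.WriteLine("{0} rows loaded", records.Rows.Count);

        // Suggestion 3: bulk-insert the rows into a global temp table and do
        // the heavy lifting with set-based SQL. Keep the connection open for
        // the whole job, otherwise ##Records is dropped.
        using (SqlConnection conn = new SqlConnection(
            "Server=.;Database=tempdb;Integrated Security=true")) // placeholder
        {
            conn.Open();
            using (SqlCommand create = new SqlCommand(
                "CREATE TABLE ##Records (Field1 nvarchar(100), Field2 nvarchar(100))", conn))
            {
                create.ExecuteNonQuery(); // columns must match the DataTable
            }
            using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
            {
                bulk.DestinationTableName = "##Records";
                bulk.WriteToServer(records);
            }
            // Fire the SELECT / UPDATE statements against ##Records here.
        }
    }
}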


ONeil Tomlinson · #4 · in reply to Rutvik Dave

Thanks, I'll give that a try.
