Limitation of DataSet

Database · Tags: database, oracle, question
6 Posts · 4 Posters
#1 · btsrinath wrote:
    Does anyone know how many records can be inserted into or fetched from a DataTable within a DataSet? I am retrieving a few million records from an Oracle database, and when I use the DataAdapter.Fill(dataSet) method or the DataSet.Tables[0].Rows.Add(dataRow) method I get "Exception of type System.OutOfMemoryException was thrown." Does anyone have any idea about this? Srinath

#2 · Colin Angus Mackay wrote:
      btsrinath wrote:

      Does anyone know how many records can be inserted into or fetched from a DataTable within a DataSet?

      ~2 billion rows, or the limit of your computer's memory, whichever comes first.

      btsrinath wrote:

      I am retrieving a few million records from an Oracle database.

      Do you really need a few million rows? What exactly are you going to do with all that information on the client?

      btsrinath wrote:

      I get "Exception of type System.OutOfMemoryException was thrown." Does anyone have any idea about this?

      Looks like you have run out of memory.


      Upcoming events: * Glasgow Geek Dinner (5th March) * Glasgow: Tell us what you want to see in 2007 My: Website | Blog | Photos

#3 · btsrinath wrote:

        Thank you. I need to process all these records. Imagine that for every 50 records from Table1, I need to insert 1 record into Table2. Calculations must be done on those 50 records first, and then the result is inserted into the database. Fetching 7,000 records from the DB takes ~1 hr 20 min, so you can imagine how many records are in the DB. Instead of processing these on the DB using a stored procedure, I thought bringing the records to the client side and processing them there would reduce the load on the DB. If I regularly update the DataSet and data adapter with the AcceptChanges and Update methods respectively, would memory usage be reduced? Can you please suggest a good way of solving my problem? Srinath
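The every-50-rows-to-one-row pattern described above does not require holding the whole table in memory. A minimal sketch of the idea in C# (the aggregate here is a plain sum, and all names are illustrative, not from the original post): accumulate a fixed-size batch from a streamed source, emit one result per batch, and keep nothing else around.

```csharp
using System;
using System.Collections.Generic;

static class BatchAggregator
{
    // Yields one aggregate (here, a sum) per consecutive group of batchSize
    // input values, without ever materialising the whole input sequence.
    public static IEnumerable<double> Flush(IEnumerable<double> source, int batchSize)
    {
        double sum = 0;
        int count = 0;
        foreach (double v in source)
        {
            sum += v;
            if (++count == batchSize)
            {
                yield return sum;   // this would become the one row for Table2
                sum = 0;
                count = 0;
            }
        }
        if (count > 0)
            yield return sum;       // partial final batch
    }
}
```

In the real job, `source` would come from a forward-only data reader and each yielded value would be written to Table2 with a parameterised INSERT, so memory use stays flat regardless of table size.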

#4 · Colin Angus Mackay wrote:
          btsrinath wrote:

          Fetching 7,000 records from the DB takes ~1 hr 20 min, so you can imagine how many records are in the DB.

          Or (possibly) how poorly configured the indexes are.

          btsrinath wrote:

          If I regularly update the DataSet and data adapter with the AcceptChanges and Update methods respectively, would memory usage be reduced?

          Unlikely. As a general rule I simply don't use DataSets because of the memory overheads. I only use them for coding things in a quick-and-dirty way, where speed of coding is what matters.

          btsrinath wrote:

          Can you please suggest a good way of solving my problem?

          If you are doing any complex joins, judicious use of the tempDB can improve performance. Pulling data into temp tables and then performing joins between those temp tables, each holding a fraction of the rows, is much faster than the same join on the original tables. A few years ago I sped up a query that was taking 20+ minutes down to 7 seconds this way. (There were billions of rows in the table, but my calculation only required joining a few thousand of them.) However, without knowing more about what you are doing, any advice I give may be completely useless to you.
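The temp-table tactic above can be sketched in ADO.NET. Everything here is an assumption for illustration: the table names (`big_table`, `other_big_table`), the pre-created temporary table `tmp_subset`, and the filter column; the point is only that each side of the join is narrowed before joining. This sketch needs a live Oracle connection to run.

```csharp
using System;
using System.Data.OracleClient; // the Microsoft-supplied Oracle provider of this era

static class TempTableJoin
{
    public static void Run(OracleConnection conn)
    {
        // 1. Copy only the rows of interest into a session-scoped temp table
        //    (assumed created once as GLOBAL TEMPORARY TABLE ... ON COMMIT PRESERVE ROWS).
        using (OracleCommand cmd = conn.CreateCommand())
        {
            cmd.CommandText =
                "INSERT INTO tmp_subset (id, amount) " +
                "SELECT id, amount FROM big_table WHERE batch_date = :d";
            cmd.Parameters.Add("d", OracleType.DateTime).Value = DateTime.Today;
            cmd.ExecuteNonQuery();
        }

        // 2. Join against the small temp table instead of the full table.
        using (OracleCommand cmd = conn.CreateCommand())
        {
            cmd.CommandText =
                "SELECT t.id, t.amount, o.descr " +
                "FROM tmp_subset t JOIN other_big_table o ON o.id = t.id";
            using (OracleDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // process one joined row at a time
                }
            }
        }
    }
}
```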


#5 · Aaron VanWieren wrote:
            You should really look at your data. I would try to find some way to query just a fraction of the data from the database. Often these issues can be solved by looking at your database from another angle. How many records at a time are you processing? Can they be processed in smaller batches? Really look at your data. Also, you might want to write the results back to the main DB from time to time; anything to free up resources. It sounds like you are trying to do too much at one time. I hope this helps.
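One concrete way to query just a fraction at a time is to walk the table in fixed key ranges. This is only a sketch: the table, columns, and the assumption of an indexed numeric `id` key are all hypothetical, and it needs a live Oracle connection to run.

```csharp
using System.Data.OracleClient;

static class RangedFetch
{
    // Processes big_table in chunks of chunkSize ids; only one chunk's rows
    // ever flow through the client, and results can be written back per chunk.
    public static void ProcessInChunks(OracleConnection conn, long maxId, long chunkSize)
    {
        for (long lo = 0; lo <= maxId; lo += chunkSize)
        {
            using (OracleCommand cmd = conn.CreateCommand())
            {
                cmd.CommandText =
                    "SELECT id, amount FROM big_table " +
                    "WHERE id >= :lo AND id < :hi ORDER BY id";
                cmd.Parameters.Add("lo", OracleType.Number).Value = lo;
                cmd.Parameters.Add("hi", OracleType.Number).Value = lo + chunkSize;
                using (OracleDataReader reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        // process this chunk's rows, then let them go
                    }
                }
            }
        }
    }
}
```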

            _____________________________________________________________________ Our developers never release code. Rather, it tends to escape, pillaging the countryside all around. The Enlightenment Project (paraphrased comment) Visit Me at GISDevCafe

#6 · Ennis Ray Lynch Jr wrote:
              If you are processing one record at a time (e.g. 1 million customer orders), use a data reader to scan through the results in a forward-only manner, using far less memory. If you need aggregate results, SQL is a powerful language designed to work on large sets of data very efficiently. Last note: if you have to work with large chunks of data in memory, find a natural divider and use smaller sets.
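The forward-only reader approach might look like the following sketch (connection string, table, and column names are placeholders, not from the thread). Unlike DataAdapter.Fill, a data reader buffers only the current row, so memory use stays constant however large the result set is.

```csharp
using System;
using System.Data.OracleClient;

static class ForwardOnlyScan
{
    public static double SumAmounts(string connectionString)
    {
        using (OracleConnection conn = new OracleConnection(connectionString))
        {
            conn.Open();
            using (OracleCommand cmd = new OracleCommand("SELECT amount FROM big_table", conn))
            using (OracleDataReader reader = cmd.ExecuteReader()) // forward-only, read-only cursor
            {
                double total = 0;
                while (reader.Read())
                    total += reader.GetDouble(0); // only the current row is in memory
                return total;
            }
        }
    }
}
```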


              File Not Found
