System.OutOfMemoryException when writing a csv file

Forum: ASP.NET
Tags: help, csharp, css, asp-net, debugging
K Safvi wrote:
#1

Dear All,

In the download-data module of my application (ASP.NET 2.0 / SQL Server 2005), one of the functions returns a DataReader holding more than seven hundred thousand records. The DataReader is returned without an exception, but when I iterate through it and append its values to a StringBuilder variable in order to write a CSV file, a System.OutOfMemoryException is thrown because of the huge number of records.

I cannot filter the records, because the CSV file must contain the whole data set. How can I avoid this error and make the system more efficient so that it uses fewer resources? Compilation debug is set to false, and the download works fine when there are only a few hundred thousand records.

Thanks in advance for the help from any esteemed member.

Thanks,
Safvi

K Safvi wrote:
#2

public static string WriteCSVFilebyDataReader(SqlDataReader SqlDataReaderObject)
{
    StringBuilder strResult = new StringBuilder();
    try
    {
        for (int i = 0; i < SqlDataReaderObject.FieldCount; i++)
        {
            strResult.Append("\" " + SqlDataReaderObject.GetName(i) + " \",");
        }
        strResult.Append("\n");

        if (SqlDataReaderObject.HasRows)
        {
            while (SqlDataReaderObject.Read())
            {
                for (int j = 0; j < SqlDataReaderObject.FieldCount; j++)
                {
                    strResult.Append("\"" + SqlDataReaderObject.GetValue(j).ToString() + "\",");
                }
                strResult.Append("\n");
                //SqlDataReaderObject.NextResult();
            }
        }
    }
    catch (Exception ex)
    {
        throw ex;
    }
    return strResult.ToString();
}

Lost User wrote:
#3

When you use a DataReader, data is not fetched from the database until you call the Read() method for the next row, which is why you do not get an error when the DataReader is returned. The problem seems to be with the StringBuilder, which is accumulating the entire result set in memory.

You can split the output into batches of, say, 10,000 rows and write them as you go: open the CSV file, write 10,000 rows, close it, then append the next 10,000 rows, and so on. You could do this using a loop counter, with each batch using a new StringBuilder instance to write to the CSV file, so only one batch is ever held in memory at a time. A rough sketch of this approach follows.
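
A minimal sketch of that batching idea, assuming an already-open SqlDataReader. The class and method names (CsvBatchWriter, WriteCsvInBatches), the csvPath parameter, and the 10,000-row batch size are illustrative assumptions, not taken from the posts above:

using System;
using System.Data.SqlClient;
using System.IO;
using System.Text;

public static class CsvBatchWriter
{
    // Streams the reader to disk in fixed-size batches so that only one
    // batch of rows is ever held in the StringBuilder at a time.
    public static void WriteCsvInBatches(SqlDataReader reader, string csvPath)
    {
        const int batchSize = 10000;   // illustrative batch size
        StringBuilder batch = new StringBuilder();

        // Header row with the column names.
        for (int i = 0; i < reader.FieldCount; i++)
        {
            batch.Append("\"" + reader.GetName(i) + "\",");
        }
        batch.Append("\n");

        int rowCount = 0;
        while (reader.Read())
        {
            for (int j = 0; j < reader.FieldCount; j++)
            {
                batch.Append("\"" + reader.GetValue(j).ToString() + "\",");
            }
            batch.Append("\n");
            rowCount++;

            // Flush the current batch to the file and start a fresh
            // StringBuilder, keeping memory usage bounded by the batch size.
            if (rowCount % batchSize == 0)
            {
                File.AppendAllText(csvPath, batch.ToString());
                batch = new StringBuilder();
            }
        }

        // Write whatever is left in the final partial batch.
        if (batch.Length > 0)
        {
            File.AppendAllText(csvPath, batch.ToString());
        }
    }
}

Appending each flushed batch with File.AppendAllText means at most one batch of rows sits in the StringBuilder at any time, which is what avoids the OutOfMemoryException for very large result sets.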
