how to clear memory

Forum: C# | Tags: help, data-structures, performance, tutorial, question | 20 posts, 9 posters
Ronenb (#1)

Hi, I have an extremely large log file that I need to parse, about 700 MB. I need to read the file content and parse each line, but I get a System.OutOfMemoryException. To work around the memory issue, I am thinking of reading the file in pages, with pageSize set to, say, 100,000 lines. To clean up the memory held by an array such as fileLinesContent, I am thinking of either setting it to null or calling Array.Clear on it. Is that the correct approach to free the memory used by an array?

    string[] fileLinesContent;

    internal void ReadFile(int pageSize)
    {
        // Set to null so the garbage collector can reclaim the previous page
        fileLinesContent = null;

        List<string> fileLines = new List<string>();
        int linesNumRead = 0;
        string line;

        while (linesNumRead < pageSize && !srLog.EndOfStream)
        {
            line = srLog.ReadLine();
            fileLines.Add(line);
            linesNumRead++;
        }

        fileLinesContent = fileLines.ToArray();
    }

Lost User (#2), in reply to #1

      Ronenb wrote:

Is that the correct approach to free the memory used by an array?

Once the array is out of scope, the garbage collector will reclaim that memory. It might not do it immediately, but when it thinks it's required. What are you going to do with the array of lines? Maybe there's a better solution, like passing a stream.

      Bastard Programmer from Hell :suss: if you can't read my code, try converting it here[^]
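
For illustration, a rough sketch of what "passing a stream" could look like here; the class and method names (LogParser, ParseLine) are made up, not from the thread:

    using System.IO;

    class LogParser
    {
        // Take a TextReader rather than an array of lines: the caller decides
        // where the data comes from (a FileStream, a StringReader in a test, ...)
        // and no full copy of the file is ever held in memory.
        public void Parse(TextReader reader)
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                ParseLine(line);
            }
        }

        private void ParseLine(string line)
        {
            // parse the elements of one line here
        }
    }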


Ronenb (#3), in reply to #2

Hi. The array (string[] fileLinesContent) is a member of the class, not a local variable in the function, so my question is how to clear it in order to free memory and read the next lines in the file. I'm reading the lines because I need to know where each line ends and parse the elements in each line. Ronen


Lost User (#4), in reply to #3

          Ronenb wrote:

the array (string[] fileLinesContent) is a member of the class, not a local variable in the function,

Then its memory will be released once the object goes out of scope.

          Ronenb wrote:

so my question is how to clear it in order to free memory and read the next lines in the file

You can't dispose of memory manually; there's a garbage collector to do that. Very long strings are moved to the Large Object Heap and simply stay there.

          Ronenb wrote:

I'm reading the lines because I need to know where each line ends and parse the elements in each line

I'd recommend reading from a Stream until you encounter the newline character, and processing each line immediately. There's little need to keep all that information in memory.

          Bastard Programmer from Hell :suss: if you can't read my code, try converting it here[^]


BobJanova (#5), in reply to #1

            700MB is not that large, you shouldn't run out of memory if you process it efficiently, even if you store the whole thing in memory. Calling List.ToArray is going to double the memory usage, though; you should either store it as a List all the time, or read it into an array to begin with (almost certainly the former). However, I suspect you are doing some streaming task and you don't actually need the whole thing. Parse each line as it comes, and don't store it; instead store whatever information about a line you need to know, if anything.
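
As a rough sketch of that idea, assuming you only need some aggregate per line (the file path and the "ERROR" filter are invented for the example); File.ReadLines enumerates the file lazily, so only one line is in memory at a time:

    using System;
    using System.IO;
    using System.Linq;

    class Program
    {
        static void Main()
        {
            string filePath = @"D:\huge.log"; // example path

            // Enumerate the file lazily and keep only the count,
            // never the 700 MB of line data itself.
            int errorCount = File.ReadLines(filePath)
                                 .Count(line => line.Contains("ERROR"));

            Console.WriteLine("Error lines: " + errorCount);
        }
    }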


Ronenb (#6), in reply to #4

Thank you all for the replies. I think it is better to use the stream option as you suggested. Can you send me an example of how to stream a file line by line? Should I use StreamFile, StreamRead, or a MemoryStream? Thanks, Ronen


Lost User (#7), in reply to #6

                Ronenb wrote:

Should I use StreamFile, StreamRead, or a MemoryStream?

A FileStream[^] would be the most appropriate. Next, you use a StreamReader, which has a [ReadLine](http://msdn.microsoft.com/en-us/library/system.io.streamreader.readline.aspx) method.

                Ronenb wrote:

Can you send me an example of how to stream a file line by line?

                See the links in this post, I don't provide copy/paste answers.

                Bastard Programmer from Hell :suss: if you can't read my code, try converting it here[^]


Pete OHanlon (#8), in reply to #7

                  Are you sure you don't deliberately sometimes supply copy/paste answers that are for the wrong question?

                  *pre-emptive celebratory nipple tassle jiggle* - Sean Ewington

                  "Mind bleach! Send me mind bleach!" - Nagy Vilmos

                  CodeStash - Online Snippet Management | My blog | MoXAML PowerToys | Mole 2010 - debugging made easier


Lost User (#9), in reply to #8

Failed to parse your question; can you slice it into simpler pieces for me? To try and answer it: yes, I do sometimes post code that can be copied and pasted. Yes, I also supply the wrong answer sometimes. Did I miss anything?

                    Bastard Programmer from Hell :suss: if you can't read my code, try converting it here[^]


Ed Hill _5_ (#10), in reply to #6

This is a console app that may do what you are after; replace the Console.WriteLine with a call to your parse method.

    class Program
    {
        static void Main(string[] args)
        {
            string filePath = @"D:\ReadMe.txt";
            using (var streamReader = new System.IO.StreamReader(filePath))
            {
                while (!streamReader.EndOfStream)
                {
                    var toParse = streamReader.ReadLine();
                    System.Console.WriteLine(toParse);
                }
            }
            System.Console.ReadLine();
        }
    }
                      

Ronenb (#11), in reply to #7

                        thanks


Pete OHanlon (#12), in reply to #9

                          Eddy Vluggen wrote:

                          Did I miss anything?

                          Yes. I had suggested that depending on how stroppy the OP had been, you could deliberately supply the wrong piece of code. Not that I'm advocating this of course. Oh no. Definitely not. Noway. Nosirree.

                          *pre-emptive celebratory nipple tassle jiggle* - Sean Ewington

                          "Mind bleach! Send me mind bleach!" - Nagy Vilmos

                          CodeStash - Online Snippet Management | My blog | MoXAML PowerToys | Mole 2010 - debugging made easier

Ronenb (#13), in reply to #10

OK, this is basically what I did; srLog is of type StreamReader. So my problem was that I didn't need to read all the file content into memory and then process it. :)

I was thinking it is more efficient to read all the file content in one go and work on it than to access the file system for each line (ReadLine) and process it.
Isn't that more expensive?

    while (!srLog.EndOfStream)
    {
        line = srLog.ReadLine();
        fileLines.Add(line);
    }
    fileLinesContent = fileLines.ToArray();
                            
Ed Hill _5_ (#14), in reply to #13

Being honest, I don't know which way would perform better; your original post involved calling ReadLine many times in a loop anyway, and you were getting memory issues. This way should not be less efficient than the originally proposed solution. I'd suggest you try it with something many times larger than your expected log file size; if the performance is good, then don't stress too much about how you could make it better. If the file I/O part of this app is not the bottleneck in the process, then don't lose too much time optimising it.
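
If you need to generate such an over-sized test file, a quick sketch (the path and line count are arbitrary):

    using System.IO;

    class MakeTestLog
    {
        static void Main()
        {
            // Write tens of millions of short lines to stress-test
            // the line-by-line reader.
            using (var writer = new StreamWriter(@"D:\test.log"))
            {
                for (int i = 0; i < 20000000; i++)
                {
                    writer.WriteLine("INFO sample log entry number " + i);
                }
            }
        }
    }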

Dave Kreskowiak (#15), in reply to #13

No, it's not. The way you were reading the file (line by line) and storing it in memory is no less expensive than processing that giant log one line at a time. You're trying to read 700MB of data into memory and running into OutOfMemory problems; how efficient do you think that is? By reading everything into memory at once, your solution only works on limited log sizes, dependent on system memory. If you process every line, one at a time, without reading the entire file into memory, you can process log files of ANY size, up to the file size limit of the operating system, and do it without requiring the machine to have terabytes of memory.

                                A guide to posting questions on CodeProject[^]
                                Dave Kreskowiak


Ennis Ray Lynch Jr (#16), in reply to #1

An array is a contiguous block of memory. Aside from the fact that for large files all processing should be done inside your while loop, without retaining any of the file in memory, your ToArray call could be killing you. If you need random access to the file, look into memory-mapped files.

                                  Need custom software developed? I do custom programming based primarily on MS tools with an emphasis on C# development and consulting. I also do Android Programming as I find it a refreshing break from the MS. "And they, since they Were not the one dead, turned to their affairs" -- Robert Frost
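
In case the random-access route is ever needed, a minimal sketch using MemoryMappedFile (the path and the 4 KB window are arbitrary examples, and the file must be at least that large):

    using System;
    using System.IO.MemoryMappedFiles;
    using System.Text;

    class Program
    {
        static void Main()
        {
            // Map the log into the address space; the OS pages data in and
            // out on demand instead of the application holding it all.
            using (var mmf = MemoryMappedFile.CreateFromFile(@"D:\huge.log"))
            using (var view = mmf.CreateViewAccessor(0, 4096)) // first 4 KB
            {
                var buffer = new byte[4096];
                view.ReadArray(0, buffer, 0, buffer.Length);
                Console.WriteLine(Encoding.ASCII.GetString(buffer));
            }
        }
    }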


Lost User (#17), in reply to #12

                                    :-D


PIEBALDconsult (#18), in reply to #1

                                      I'll repeat that you probably shouldn't be keeping a bunch of rows in memory. However, if you really need to, I suggest defining a class to hold an entry once it has been parsed. And don't use an array; use a List or a Queue, or if you're passing it to another class, consider using an event.
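
A rough sketch of that suggestion, with the LogEntry fields, the line format, and the event name invented for illustration:

    using System;
    using System.IO;

    // A small class holding one parsed entry instead of the raw line.
    class LogEntry
    {
        public DateTime Timestamp;
        public string Level;
        public string Message;
    }

    class LogReader
    {
        // Consumers subscribe instead of receiving a giant array of lines.
        public event Action<LogEntry> EntryParsed;

        public void Read(string path)
        {
            using (var reader = new StreamReader(path))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    LogEntry entry = Parse(line);
                    if (entry != null && EntryParsed != null)
                        EntryParsed(entry);
                }
            }
        }

        LogEntry Parse(string line)
        {
            // Hypothetical format: "2012-05-14 10:23:01 ERROR something broke"
            string[] parts = line.Split(new[] { ' ' }, 4);
            if (parts.Length < 4) return null;
            DateTime ts;
            if (!DateTime.TryParse(parts[0] + " " + parts[1], out ts)) return null;
            return new LogEntry { Timestamp = ts, Level = parts[2], Message = parts[3] };
        }
    }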


RobCroll (#19), in reply to #1

You already know how many lines are in the file through the pageSize argument, which is interesting, as it suggests the file has already been read at least once; there may be memory issues there too. I can't tell what types fileLinesContent and fileLines are, but arrays, ArrayLists and List&lt;T&gt; all let you set the collection's capacity, and setting the capacity when initialising will potentially save memory. Why not make fileLines the same type as fileLinesContent, so you can drop the ToArray() call? As already mentioned, 700M isn't a big deal; there must be another part to this problem, unless you are working on an old PC.

                                        "You get that on the big jobs."


Ronenb (#20), in reply to #15

Got it. :) Thanks, all. Ronen
