How to clear memory
-
Thank you all for the replies. I think it is better to use the stream option, as you suggested. Can you send me an example of how to stream a file line by line? Should I use the StreamFile, StreamRead, or MemoryStream service? Thanks, Ronen
This is a console app that may do what you are after; replace the Console.WriteLine with a call to your parse method:
```csharp
class Program
{
    static void Main(string[] args)
    {
        string filePath = @"D:\ReadMe.txt";
        using (var streamReader = new System.IO.StreamReader(filePath))
        {
            while (!streamReader.EndOfStream)
            {
                var toParse = streamReader.ReadLine();
                System.Console.WriteLine(toParse);
            }
        }
        System.Console.ReadLine();
    }
}
```
-
Ronenb wrote:
Should I use the StreamFile, StreamRead, or MemoryStream service?
A FileStream[^] would be the most appropriate. Next, you use a StreamReader, which has a [ReadLine](http://msdn.microsoft.com/en-us/library/system.io.streamreader.readline.aspx)[[^](http://msdn.microsoft.com/en-us/library/system.io.streamreader.readline.aspx "New Window")] method.
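A minimal sketch of how those fit together (the path is hypothetical, and the parsing is left to you):

```csharp
using System.IO;

class Example
{
    static void Main()
    {
        // FileStream provides the file handle; StreamReader wraps it
        // for line-oriented reads.
        using (var fs = new FileStream(@"D:\ReadMe.txt", FileMode.Open, FileAccess.Read))
        using (var reader = new StreamReader(fs))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                // parse the line here
            }
        }
    }
}
```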
Ronenb wrote:
Can you send me an example of how to stream a file line by line?
See the links in this post; I don't provide copy/paste answers.
Bastard Programmer from Hell :suss: if you can't read my code, try converting it here[^]
-
Failed to parse your question; can you slice it into simpler pieces for me? To try and answer that: yes, I do sometimes post code that can be copied/pasted. Yes, I also supply the wrong answer sometimes. Did I miss anything?
Bastard Programmer from Hell :suss: if you can't read my code, try converting it here[^]
Eddy Vluggen wrote:
Did I miss anything?
Yes. I had suggested that depending on how stroppy the OP had been, you could deliberately supply the wrong piece of code. Not that I'm advocating this of course. Oh no. Definitely not. Noway. Nosirree.
*pre-emptive celebratory nipple tassle jiggle* - Sean Ewington
"Mind bleach! Send me mind bleach!" - Nagy Vilmos
CodeStash - Online Snippet Management | My blog | MoXAML PowerToys | Mole 2010 - debugging made easier
-
This is a console app that may do what you are after; replace the Console.WriteLine with a call to your parse method:

```csharp
class Program
{
    static void Main(string[] args)
    {
        string filePath = @"D:\ReadMe.txt";
        using (var streamReader = new System.IO.StreamReader(filePath))
        {
            while (!streamReader.EndOfStream)
            {
                var toParse = streamReader.ReadLine();
                System.Console.WriteLine(toParse);
            }
        }
        System.Console.ReadLine();
    }
}
```
OK, this is basically what I did; srLog is of type StreamReader. So my problem was that I didn't need to read all the file content into memory and then process it :)
I was thinking it is more efficient to read all the file content (in one stroke) and work on it than to access the file system for each line (ReadLine) and process it.
Isn't it more expensive?

```csharp
while (!srLog.EndOfStream)
{
    line = srLog.ReadLine();
    fileLines.Add(line);
}
fileLinesContent = fileLines.ToArray();
```
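As it happens, ReadLine does not go to the file system once per line: StreamReader reads the file through an internal buffer. A minimal sketch (the path and buffer size are assumptions) that makes the buffer explicit:

```csharp
using System.IO;
using System.Text;

class BufferedRead
{
    static void Main()
    {
        // Each disk read fills the 64 KB buffer; ReadLine then slices
        // lines out of memory until the buffer is exhausted.
        using (var srLog = new StreamReader(@"D:\app.log", Encoding.UTF8, true, 64 * 1024))
        {
            string line;
            while ((line = srLog.ReadLine()) != null)
            {
                // process the line here instead of storing it
            }
        }
    }
}
```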
-
Ronenb wrote:
OK, this is basically what I did; srLog is of type StreamReader. So my problem was that I didn't need to read all the file content into memory and then process it :)
I was thinking it is more efficient to read all the file content (in one stroke) and work on it than to access the file system for each line (ReadLine) and process it.
Isn't it more expensive?

```csharp
while (!srLog.EndOfStream)
{
    line = srLog.ReadLine();
    fileLines.Add(line);
}
fileLinesContent = fileLines.ToArray();
```
Being honest, I don't know which way would perform better. Your original post involved calling ReadLine many times in a loop anyway, and you were getting memory issues, so this way should not be less efficient than the originally proposed solution. I'd suggest you try it with something many times larger than your expected log file size; if the performance is good, don't stress too much about how you could make it better. If the file I/O part of this app is not the bottleneck in the process, don't lose too much time optimising it.
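A rough way to measure it, if you want numbers (the test file path is hypothetical):

```csharp
using System;
using System.Diagnostics;
using System.IO;

class ReadTiming
{
    static void Main()
    {
        // Point this at a file several times larger than the real logs.
        var sw = Stopwatch.StartNew();
        long lines = 0;
        using (var reader = new StreamReader(@"D:\big-test.log"))
        {
            while (reader.ReadLine() != null)
            {
                lines++;
            }
        }
        sw.Stop();
        Console.WriteLine("{0} lines in {1} ms", lines, sw.ElapsedMilliseconds);
    }
}
```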
-
Ronenb wrote:
OK, this is basically what I did; srLog is of type StreamReader. So my problem was that I didn't need to read all the file content into memory and then process it :)
I was thinking it is more efficient to read all the file content (in one stroke) and work on it than to access the file system for each line (ReadLine) and process it.
Isn't it more expensive?

```csharp
while (!srLog.EndOfStream)
{
    line = srLog.ReadLine();
    fileLines.Add(line);
}
fileLinesContent = fileLines.ToArray();
```
No, it's not. The way you were reading the file (line by line) and storing it in memory is no less expensive than processing that giant log one line at a time. You're trying to read 700 MB of data into memory and running into OutOfMemory problems. How efficient do you think that is??

By reading everything into memory all at once, your solution only works on limited log sizes, dependent on system memory. If you process every line, one at a time, without reading the entire file into memory, you can process log files of ANY size, up to the file size limit of the operating system, and do it without requiring the machine to have terabytes of memory.
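A minimal sketch of that one-line-at-a-time approach (ParseLine is a hypothetical stand-in for your parsing code):

```csharp
using System.IO;

class StreamingParse
{
    static void Main()
    {
        // File.ReadLines enumerates lazily: only the current line is in
        // memory, so the file can be far larger than available RAM.
        foreach (string line in File.ReadLines(@"D:\huge.log"))
        {
            ParseLine(line);
        }
    }

    static void ParseLine(string line)
    {
        // hypothetical per-line parsing goes here
    }
}
```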
A guide to posting questions on CodeProject[^]
Dave Kreskowiak
-
Hi, I have an extremely large log file that I need to parse, 700 MB. I need to read the file content and parse each line. The problem is that I get a System.OutOfMemoryException. To solve the memory issue, I am thinking of reading the file in pages, defining pageSize to be 100,000 lines. To clear the memory of an array such as fileLinesContent, I am thinking of setting it to null or calling Array.Clear on it. Is this the correct approach to clear the memory of an array?

```csharp
string[] fileLinesContent;

internal void ReadFile(int pageSize)
{
    // To clear memory for the garbage collector
    fileLinesContent = null;

    List<string> fileLines = new List<string>();
    int linesNumRead = 0;
    string line;
    while (linesNumRead < pageSize && !srLog.EndOfStream)
    {
        line = srLog.ReadLine();
        fileLines.Add(line);
        linesNumRead++;
    }
    fileLinesContent = fileLines.ToArray();
}
```
An array is a contiguous block of memory. Setting aside the fact that, for large files, all processing should be done in your while loop without retaining any of the file in memory, your ToArray call could be killing you. If you need random access to the file, look into memory-mapped files.
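A minimal sketch of the memory-mapped route, in case random access is really needed (the path and offset are hypothetical):

```csharp
using System;
using System.IO.MemoryMappedFiles;

class MappedAccess
{
    static void Main()
    {
        // The OS pages parts of the file in on demand; nothing close to
        // 700 MB has to live in the process at once.
        using (var mmf = MemoryMappedFile.CreateFromFile(@"D:\huge.log"))
        using (var accessor = mmf.CreateViewAccessor())
        {
            byte b = accessor.ReadByte(123456789); // jump straight to an offset
            Console.WriteLine(b);
        }
    }
}
```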
Need custom software developed? I do custom programming based primarily on MS tools with an emphasis on C# development and consulting. I also do Android Programming as I find it a refreshing break from the MS. "And they, since they Were not the one dead, turned to their affairs" -- Robert Frost
-
Ronenb wrote:
Hi, I have an extremely large log file that I need to parse, 700 MB. I need to read the file content and parse each line. The problem is that I get a System.OutOfMemoryException. To solve the memory issue, I am thinking of reading the file in pages, defining pageSize to be 100,000 lines. To clear the memory of an array such as fileLinesContent, I am thinking of setting it to null or calling Array.Clear on it. Is this the correct approach to clear the memory of an array?

```csharp
string[] fileLinesContent;

internal void ReadFile(int pageSize)
{
    // To clear memory for the garbage collector
    fileLinesContent = null;

    List<string> fileLines = new List<string>();
    int linesNumRead = 0;
    string line;
    while (linesNumRead < pageSize && !srLog.EndOfStream)
    {
        line = srLog.ReadLine();
        fileLines.Add(line);
        linesNumRead++;
    }
    fileLinesContent = fileLines.ToArray();
}
```
I'll repeat that you probably shouldn't be keeping a bunch of rows in memory. However, if you really need to, I suggest defining a class to hold an entry once it has been parsed. And don't use an array; use a List or a Queue, or if you're passing it to another class, consider using an event.
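One possible shape for that, with a hypothetical LogEntry class and an event so nothing piles up in a collection:

```csharp
using System;
using System.IO;

// Hypothetical parsed-entry class; the fields are assumptions.
class LogEntry
{
    public DateTime Timestamp;
    public string Message;
}

class LogParser
{
    // Each entry is handed to subscribers as soon as it is parsed,
    // so the parser never holds more than one entry at a time.
    public event Action<LogEntry> EntryParsed;

    public void Parse(string path)
    {
        using (var reader = new StreamReader(path))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                // real timestamp/message parsing would go here
                var entry = new LogEntry { Timestamp = DateTime.Now, Message = line };
                var handler = EntryParsed;
                if (handler != null) handler(entry);
            }
        }
    }
}
```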
-
Ronenb wrote:
Hi, I have an extremely large log file that I need to parse, 700 MB. I need to read the file content and parse each line. The problem is that I get a System.OutOfMemoryException. To solve the memory issue, I am thinking of reading the file in pages, defining pageSize to be 100,000 lines. To clear the memory of an array such as fileLinesContent, I am thinking of setting it to null or calling Array.Clear on it. Is this the correct approach to clear the memory of an array?

```csharp
string[] fileLinesContent;

internal void ReadFile(int pageSize)
{
    // To clear memory for the garbage collector
    fileLinesContent = null;

    List<string> fileLines = new List<string>();
    int linesNumRead = 0;
    string line;
    while (linesNumRead < pageSize && !srLog.EndOfStream)
    {
        line = srLog.ReadLine();
        fileLines.Add(line);
        linesNumRead++;
    }
    fileLinesContent = fileLines.ToArray();
}
```
You already know how many lines are in the file, given the pageSize argument. Interesting, as it suggests the file has already been read at least once; there may be memory issues there. I can't tell what types fileLinesContent or fileLines are, but arrays, ArrayLists, and Lists let you set the collection's capacity, and setting the capacity when initializing will potentially save memory. Why not make fileLines the same type as fileLinesContent? Then you can drop the ToArray() call. As already mentioned, 700 MB isn't a big deal; there must be another part to this problem, unless you are working on an old PC.
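For example, passing the expected count to the constructor (pageSize as in the original post):

```csharp
using System.Collections.Generic;

class Paging
{
    static List<string> NewPage(int pageSize)
    {
        // One up-front allocation of pageSize slots, instead of the list
        // repeatedly doubling and copying its backing array as it grows.
        return new List<string>(pageSize);
    }
}
```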
"You get that on the big jobs."