File.Open method for large files
-
Hello! I'm looking for a solution to get a stream from large files. I'm writing an application which uses SharpZipLib to create .zip archives. I use File.Open to get a FileStream and then give it to the component, which writes to the zip stream. It works great, but with bigger files the app uses a large amount of system memory. I don't know how to make it better - how can I make some kind of pipeline, or read from the stream in parts? Does someone have an idea? Thanks for the help!
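One common way to keep memory flat is to copy the file into the archive in small chunks instead of reading it whole. A minimal sketch using SharpZipLib's `ZipOutputStream` (the file paths and entry name here are made-up placeholders, and the level value is just an example):

```csharp
using System.IO;
using ICSharpCode.SharpZipLib.Zip;

// Placeholder paths - substitute your own.
using (var zipStream = new ZipOutputStream(File.Create(@"C:\out\archive.zip")))
{
    // 0-9; lower levels use less CPU and compressor state,
    // at the cost of a slightly larger output file.
    zipStream.SetLevel(6);

    zipStream.PutNextEntry(new ZipEntry("bigfile.bin"));

    // Copy in fixed-size chunks so only one buffer's worth of
    // file data is in memory at any time.
    using (var input = File.OpenRead(@"C:\data\bigfile.bin"))
    {
        byte[] buffer = new byte[4096];
        int read;
        while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
        {
            zipStream.Write(buffer, 0, read);
        }
    }

    zipStream.CloseEntry();
    zipStream.Finish();
}
```

With this pattern only one 4 KB buffer of file data is held at a time; if memory still climbs, the usage is coming from the compressor's internal state rather than from the file stream.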
-
A file stream doesn't use any more memory if you open a large file. It's probably the compression component that uses the memory. I don't know if that specific library supports this, but speaking generally about zip compression: the amount of memory the compression uses depends on the compression level selected. If you select a lower compression level, it should use much less memory. The difference in file size between compression levels is usually quite small.
-
Thanks for the answer! I'll try changing the compression level (currently it's 6, and the maximum is 9). But I don't understand why the compression level changes the memory usage - I thought a higher level eats more CPU time, but memory...? So you're saying the File.Open method handles the large-file problem? And what about reading time?
-
nemopeti wrote:
So you're saying the File.Open method handles the large-file problem?
Yes. A stream uses a buffer to read a small part of the file at a time. The default size for the buffer is 4096 bytes, so that's certainly not the cause of the memory consumption.
nemopeti wrote:
And what about reading time?
What about it?
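For reference, the stream's buffer size can also be set explicitly when opening the file. A small sketch (the path and the 64 KB size are illustrative assumptions, not recommendations):

```csharp
using System.IO;

// FileStream reads through a fixed-size buffer; the default is
// 4096 bytes, so opening a large file does not load it into memory.
// A larger buffer can reduce the number of disk reads for big files.
using (var stream = new FileStream(@"C:\data\bigfile.bin",
                                   FileMode.Open, FileAccess.Read,
                                   FileShare.Read, 64 * 1024))
{
    // read from the stream as usual
}
```

Either way the buffer stays a fixed size, so opening a multi-gigabyte file costs no more memory than opening a small one; a bigger buffer only trades a little memory for fewer read calls.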
-
Guffa wrote:
nemopeti wrote: So you said File.Open method handle the large file problem? Yes. A stream uses a buffer to read a small part of the file at a time. The default size for the buffer is 4096 bytes, so that's certainly not the cause of the memory consumption.
But for a compressed file, the dictionary needs to be in memory until the extraction is completed.