File.Open method for large files

Tags: question, performance, help
nemopeti (#1)

Hello! I'm looking for a way to get a stream from large files. I'm writing an application which uses SharpZipLib to create .zip archives. I use File.Open to get a FileStream and then give it to the component, which writes to the zip stream. It works great, but with bigger files the app uses a large amount of system memory. I don't know how to make this better, how to build some kind of pipeline or chunked reading from the stream... Does anyone have an idea? Thanks for the help!
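
A minimal sketch of the setup described here (the file and entry names are placeholders): File.Open supplies the FileStream, SharpZipLib's ZipOutputStream receives the data, and the copy is done in small fixed-size chunks rather than reading the whole file at once, so only one buffer of data sits in memory at a time.

using System.IO;
using ICSharpCode.SharpZipLib.Zip;

class ZipLargeFile
{
    static void Main()
    {
        using (FileStream zipFile = File.Create("archive.zip"))
        using (ZipOutputStream zipStream = new ZipOutputStream(zipFile))
        {
            zipStream.PutNextEntry(new ZipEntry("bigfile.dat"));

            // File.Open does not read the file into memory; data is pulled on demand.
            using (FileStream source = File.Open("bigfile.dat", FileMode.Open, FileAccess.Read))
            {
                byte[] buffer = new byte[4096];
                int read;
                while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
                {
                    // Only one 4 KB chunk is held in memory at a time.
                    zipStream.Write(buffer, 0, read);
                }
            }

            zipStream.CloseEntry();
            zipStream.Finish();
        }
    }
}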

Guffa (#2)

A file stream doesn't use any more memory if you open a large file. It's probably the compression component that uses the memory. I don't know if that specific library supports this, but speaking generally about zip compression: the amount of memory the compression uses depends on the compression level selected. If you select a lower compression level it should use much less memory. The difference in file size between different compression levels is usually quite small.
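
A short sketch of that suggestion, assuming the archive is written with SharpZipLib's ZipOutputStream; its SetLevel method takes 0 (store only) through 9 (maximum compression), and the file names are placeholders.

using System.IO;
using ICSharpCode.SharpZipLib.Zip;

class LowCompressionExample
{
    static void Main()
    {
        using (ZipOutputStream zipStream = new ZipOutputStream(File.Create("archive.zip")))
        {
            // Lower than the default level 6: less memory and CPU,
            // and the resulting file is usually only slightly larger.
            zipStream.SetLevel(3);

            zipStream.PutNextEntry(new ZipEntry("bigfile.dat"));
            // ... write the file data in chunks as in the sketch above ...
            zipStream.CloseEntry();
            zipStream.Finish();
        }
    }
}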

nemopeti (#3)

Thanks for the answer! I'll try changing the compression level (currently it is 6, and the maximum is 9). But I don't understand why the compression level changes memory usage; I'd think a higher level eats more CPU time, but memory... So you're saying the File.Open method handles the large-file problem? And what about reading time?

Dan Neely (#4)

Zip files work by building a 'dictionary': each short bit string in the compressed output stands for a longer string of bytes from the original data. Higher compression levels use a larger dictionary.

Guffa (#5)

nemopeti wrote:

So you're saying the File.Open method handles the large-file problem?

Yes. A stream uses a buffer to read a small part of the file at a time. The default size for the buffer is 4096 bytes, so that's certainly not the cause of the memory consumption.

nemopeti wrote:

And what about reading time?

What about it?
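
To illustrate the point about the read buffer (the path is a placeholder): the FileStream constructor also accepts the buffer size explicitly, and memory use stays at roughly one buffer no matter how large the file is.

using System.IO;

class BufferSizeExample
{
    static void Main()
    {
        // 4096 bytes is the default buffer size; opening the file does not load its contents.
        using (FileStream source = new FileStream("bigfile.dat", FileMode.Open,
                                                  FileAccess.Read, FileShare.Read, 4096))
        {
            byte[] buffer = new byte[4096];
            int read;
            while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
            {
                // Process the chunk; only one buffer is held in memory regardless of file size.
            }
        }
    }
}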

Dan Neely (#6)

Guffa wrote:

A stream uses a buffer to read a small part of the file at a time. The default size for the buffer is 4096 bytes, so that's certainly not the cause of the memory consumption.

But for a compressed file, the dictionary needs to be in memory until the extraction is completed.

Guffa (#7)

Yes, the compression uses a lot of memory. The file stream doesn't.
