Code Project
Deleting Read Bytes

Forum: C# · Tags: help, performance, tutorial, question · 6 Posts, 3 Posters
  • computerpublic
    #1

    /* To simplify the explanation of the problem, I am using a very small portion of the code in which the problem occurs. While attempting to read bytes from a large file, I am getting OutOfMemory errors. I don't want to buffer, store or save the bytes I am reading. I simply want to read each byte, then discard it completely (to free up the memory and prevent the error), then read the next byte, and so on to the end. I am not sure HOW or WHERE to do my deletion of the bytes after they are read. I have been reading about "flush" but I am not sure how to use it, or even whether it will work. Can someone please assist? */
    using System;
    using System.IO;

    namespace Applica
    {
        class Program
        {
            static void Main(string[] args)
            {
                DirectoryInfo da = new DirectoryInfo("C:\\Folder");
                FileInfo[] Arr = da.GetFiles();
                if (Arr.Length == 0)
                {
                    throw new InvalidOperationException("No files found.");
                }
                FileInfo ap = Arr[Arr.Length - 1];
                long Totbyte = ap.Length;
                string filePath = ap.FullName;
                string temPath = Path.GetTempFileName();
                byte[] data = File.ReadAllBytes(filePath); // loads the entire file into memory
                //File.WriteAllBytes(temPath, data);
            }
        }
    }

  • OriginalGriff
    #2

      The ReadAllBytes method does exactly that: it reads the entire file content into memory and returns it as an array of bytes. If you are getting "Out of memory" errors, then there are two main possibilities:

      1) The size of some file exceeds 2GB. There is an absolute 2GB limit on any .NET object; no single item (and an array of bytes is one item) can exceed it.

      2) These are large files, and as such they will go on the Large Object Heap - anything larger than 85KB is a "large object" and goes on the LOH - which is not compacted; its objects are only reclaimed when they are garbage collected, and it is filling up... but that should trigger a GC, which should empty it, so it shouldn't cause a problem.

      The bad news is... System.Array does not implement IDisposable, so you can't free it yourself. You might be able to force it by setting data to null and manually calling the GC, but that's a nasty solution. I would suggest that your best approach might be to "block read" your files: 64K lumps, for example, so that any allocation problem is reduced.
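      A minimal sketch of the block-read approach described above: one reused 64 KB buffer (small enough to stay off the LOH), so memory use stays constant regardless of file size. The file name and `CountBytes` helper here are illustrative, not from the thread.

      ```csharp
      using System;
      using System.IO;

      class BlockReader
      {
          // Reads the file in 64 KB lumps, reusing a single buffer;
          // returns the total number of bytes seen.
          public static long CountBytes(string path)
          {
              const int BlockSize = 64 * 1024; // 64 KB: well below the 85 KB LOH threshold
              byte[] buffer = new byte[BlockSize];
              long total = 0;
              using (FileStream fs = new FileStream(path, FileMode.Open, FileAccess.Read))
              {
                  int read;
                  while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
                  {
                      // process buffer[0..read) here, then let the next
                      // Read overwrite it - nothing accumulates in memory
                      total += read;
                  }
              }
              return total;
          }

          static void Main()
          {
              string tmp = Path.GetTempFileName();
              File.WriteAllBytes(tmp, new byte[200000]); // ~195 KB sample file
              Console.WriteLine(CountBytes(tmp));        // prints 200000
              File.Delete(tmp);
          }
      }
      ```

      The key point is that `Read` returns how many bytes it actually delivered, so the loop handles the short final block without any special casing.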

      Those who fail to learn history are doomed to repeat it. --- George Santayana (December 16, 1863 – September 26, 1952) Those who fail to clear history are doomed to explain it. --- OriginalGriff (February 24, 1959 – ∞)

      "I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
      "Common sense is so rare these days, it should be classified as a super power" - Random T-shirt


  • computerpublic
    #3

        /* I am using the code below only as a basic example. Since I am reading large files, how could I modify this code to do a "block read" in lumps of 64k at a time? */
        using System;
        using System.IO;

        namespace Applica
        {
            class Program
            {
                static void Main(string[] args)
                {
                    DirectoryInfo da = new DirectoryInfo("C:\\Folder");
                    FileInfo[] Arr = da.GetFiles();
                    if (Arr.Length == 0)
                    {
                        throw new InvalidOperationException("No files found.");
                    }
                    FileInfo ap = Arr[Arr.Length - 1];
                    long Totbyte = ap.Length;
                    string filePath = ap.FullName;
                    string temPath = Path.GetTempFileName();
                    byte[] data = File.ReadAllBytes(filePath);
                    File.WriteAllBytes(temPath, data);
                    decimal[] arry = new decimal[Totbyte]; // 16 bytes per element
                    for (int count = 0; count < data.Length; count++)
                    {
                        arry[count] = data[count];
                    }
                    byte[] data2 = new byte[Totbyte];
                    for (int count = 0; count < arry.Length; count++)
                    {
                        data2[count] = (byte)arry[count];
                    }
                    string filePath2 = Path.Combine("C:\\check", Path.GetFileName(filePath));
                    File.WriteAllBytes(filePath2, data2);
                    data = File.ReadAllBytes(temPath);
                    data2 = File.ReadAllBytes(filePath);
                }
            }
        }


  • OriginalGriff
    #4

          ¡Ay, caramba! You do realize what that code does, don't you? Not only do you allocate two arrays of bytes the same size as each file in the folder, you also allocate a third array 16 times larger for each file as well! The decimal datatype is 128 bits wide - that's 16 bytes! So if any file in your folder is bigger than about 134MB, you will exceed the 2GB maximum-object limit... and get "out of memory". And frankly, that code doesn't make a whole lot of sense: you read the file as bytes, copy each byte value to a decimal (which does nothing in practice), then convert each decimal back to a byte (which gives you exactly what you started with!), and then you write the data to a new place... You could do the same much more easily with File.Copy... :laugh:
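          For what the posted code actually accomplishes, the suggested single call does the job - File.Copy streams the data itself instead of holding it all in memory. A self-contained sketch using temp-file placeholders rather than the poster's paths:

          ```csharp
          using System;
          using System.IO;

          class CopyDemo
          {
              static void Main()
              {
                  string src = Path.GetTempFileName();
                  string dst = Path.GetTempFileName();
                  File.WriteAllBytes(src, new byte[] { 1, 2, 3 });

                  // One call replaces the whole read / convert-to-decimal /
                  // convert-back / write round trip.
                  File.Copy(src, dst, true); // true = overwrite destination

                  Console.WriteLine(File.ReadAllBytes(dst).Length); // prints 3
                  File.Delete(src);
                  File.Delete(dst);
              }
          }
          ```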


  • Lost User
    #5

    This is a duplicate of http://www.codeproject.com/Messages/4770301/Invalid-Argument-in-For-Loop.aspx[^], where it was previously explained that what you are doing makes no sense and just uses memory to no purpose. Little wonder that it crashes. Once again: why not explain what problem you are trying to solve, and maybe we can help you?


  • Lost User
    #6

              See my comment below; I thought I recognised this code.
