Code Project
Slow ReadFile on DVD Drive

Hardware & Devices
rctodd wrote (#1):

Hello - I'm working on a program that scans files on a DVD created by our product (to calculate a CRC). When using Windows Explorer to copy from the DVD (USB connection) to my C: drive, it takes about a minute to copy a 1GB file. My program, however, takes close to ten minutes to read the file (doing nothing else). The following snippet demonstrates the problem:

    #define HEAP_SIZE 1000000 // 1,000,000 bytes
    static BYTE g_buffer[HEAP_SIZE];
    DWORD nBytesRead;
    int nNumberOfReads = 0;

    HANDLE hFile = CreateFile( strThisFile, GENERIC_READ, 0, NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);

    while ( ReadFile( hFile, g_buffer, HEAP_SIZE, &nBytesRead, NULL))
    {
        ++nNumberOfReads;

        if ( nBytesRead == 0)
        {
            break;
        }
    }

    Same results are observed when g_buffer is created using HeapAlloc and with varying HEAP_SIZE values. Any hints as to why I'm seeing this are appreciated! Thanks.


Luc Pattyn wrote (#2):

Why do you make it hard on the system?

- Files get allocated by sectors and clusters on disk; these are all quantities that equal a power of 2, so a power of 10 for your HEAP_SIZE value is not so good.
- Files get cached somehow (the file cache in RAM, and the data cache L1/L2/L3 inside or close to the CPU chip); the hardware cache has limited size, so it probably is better to reduce HEAP_SIZE.

I would suggest you try HEAP_SIZE values of 2^15 up to 2^18 (that's NOT C syntax!). I expect you can get around 50 to 100 MB/s on an average system. And when organized well (two threads, ping-pong), the CRC should be free. Good luck.

      Luc Pattyn


      try { [Search CP Articles] [Search CP Forums] [Forum Guidelines] [My Articles] } catch { [Google] }
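The "two threads, ping-pong" idea above can be sketched in portable C++ as well; the sketch below is only an illustration, not anyone's actual code: it uses `std::thread` instead of Win32 threads, a trivial byte-sum standing in for the real CRC, and an assumed 64 KiB block size. One thread reads block N+1 while the main thread checksums block N, so the CRC work overlaps the I/O wait.

```cpp
#include <cstdint>
#include <cstdio>
#include <thread>
#include <vector>

static const size_t BLOCK_SIZE = 1 << 16; // 65536: a power of 2, as suggested

// Stand-in for the real CRC routine (any per-block checksum fits this pattern).
static uint32_t checksum_block(const unsigned char *p, size_t n, uint32_t acc)
{
    for (size_t i = 0; i < n; ++i) acc += p[i];
    return acc;
}

uint32_t checksum_file(const char *path)
{
    std::FILE *f = std::fopen(path, "rb");
    if (!f) return 0;

    // Two buffers: while one is being checksummed, the other is being filled.
    std::vector<unsigned char> buf[2] = {
        std::vector<unsigned char>(BLOCK_SIZE),
        std::vector<unsigned char>(BLOCK_SIZE)
    };
    uint32_t acc = 0;
    int cur = 0;
    size_t got = std::fread(buf[cur].data(), 1, BLOCK_SIZE, f);

    while (got > 0) {
        size_t next_got = 0;
        int nxt = 1 - cur;
        // Read the next block on a worker thread while we checksum this one.
        std::thread reader([&] {
            next_got = std::fread(buf[nxt].data(), 1, BLOCK_SIZE, f);
        });
        acc = checksum_block(buf[cur].data(), got, acc);
        reader.join();
        cur = nxt;       // ping-pong the buffers
        got = next_got;
    }
    std::fclose(f);
    return acc;
}
```

With the reader thread always one block ahead, the checksum cost largely disappears behind the disk wait, which is what makes the CRC "free".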


rctodd wrote (#3):

The notion that the block size should be consistent with cluster sizes makes a lot of sense. However, I tried several sizes (65536, 262140, 1MB, 4MB), all with the same (slow) results using the USB-connected DVD drive (advertised as "16X"). I did notice something odd, though. Currently using a 4MB block size, I put a breakpoint in my loop and found that if I stepped the program (one or two ReadFile calls per second), the drive motor sped up. I added a Sleep(1000) in the loop and found that my read rate went from about 1MB/sec to 4MB/sec (along with an increase in motor speed). It's as if blocking on ReadFile is slowing things down. So I may experiment with overlapped I/O (new to me, as I spend most of my time working on embedded systems). Thanks.
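For reference, a minimal overlapped-I/O read loop might look like the sketch below. This is Win32-only and only a starting point under assumed conditions: the file name is a placeholder, the 256 KiB block size is an arbitrary power of 2, and most error handling is omitted.

```cpp
#include <windows.h>

#define BLOCK_SIZE (1 << 18) // 256 KiB, a power of 2

int main(void)
{
    static BYTE buffer[BLOCK_SIZE];
    // FILE_FLAG_OVERLAPPED enables asynchronous ReadFile on this handle.
    HANDLE hFile = CreateFileA("D:\\somefile.dat", GENERIC_READ, FILE_SHARE_READ,
                               NULL, OPEN_EXISTING, FILE_FLAG_OVERLAPPED, NULL);
    if (hFile == INVALID_HANDLE_VALUE) return 1;

    OVERLAPPED ov = {0};
    ov.hEvent = CreateEvent(NULL, TRUE, FALSE, NULL); // manual-reset event
    LONGLONG offset = 0;
    DWORD nBytesRead;

    for (;;)
    {
        // With an overlapped handle, the read position comes from the
        // OVERLAPPED structure, not from a file pointer.
        ov.Offset     = (DWORD)(offset & 0xFFFFFFFF);
        ov.OffsetHigh = (DWORD)(offset >> 32);
        ResetEvent(ov.hEvent);
        if (!ReadFile(hFile, buffer, BLOCK_SIZE, NULL, &ov) &&
            GetLastError() != ERROR_IO_PENDING)
            break; // EOF (ERROR_HANDLE_EOF) or a real error
        if (!GetOverlappedResult(hFile, &ov, &nBytesRead, TRUE) || nBytesRead == 0)
            break;
        // ... feed buffer[0..nBytesRead) to the CRC here; to actually overlap
        // work with I/O, issue the next ReadFile before processing ...
        offset += nBytesRead;
    }
    CloseHandle(ov.hEvent);
    CloseHandle(hFile);
    return 0;
}
```

As written the loop still waits for each read to finish; the benefit comes once a second buffer is added so the next ReadFile is in flight while the CRC runs.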

Luc Pattyn wrote (#4):

Hi, I missed the "USB connection" in your first post. That will reduce the performance you might hope to get to less than 40 MB/s. You should be able to reach whatever speed Windows Explorer gives you when copying a file from DVD to HDD. Have you tried using a FileStream? You could loop reading, say, 4KB from a FileStream and let Windows worry about the optimization stuff; it is supposed to make the best of it... But then, the motor speed observations are surprising. I am not familiar with DVD on USB; I expect an optical drive to contain a cache capable of holding a couple of tracks, and to preread one track; so if you can transfer a track as fast as the disk rotates I don't expect particular sounds... :)



rctodd wrote (#5):

Haven't tried FileStream yet. I did stumble on an interesting item: I changed my call to CreateFile to

    hFile = CreateFile( strThisFile, GENERIC_READ, 0, NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL | FILE_FLAG_NO_BUFFERING, NULL);

The FILE_FLAG_NO_BUFFERING flag had a dramatic effect: read speeds in the 18 MB/sec range (block size also makes some difference here). Thanks for the suggestions - they're what got me thinking. Regards.
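For anyone repeating this, note that the documentation attaches alignment rules to FILE_FLAG_NO_BUFFERING: read sizes and file offsets must be multiples of the volume's sector size, and the buffer address must be sector-aligned, so a plain static array is not guaranteed to qualify. A sketch of one safe way to satisfy this (Win32-only; the file name and 4MB block size are placeholders) uses VirtualAlloc, which returns page-aligned memory:

```cpp
#include <windows.h>

#define HEAP_SIZE (4 * 1024 * 1024) // 4 MiB: a multiple of any sector size

int main(void)
{
    // VirtualAlloc memory is aligned to the page size, which is at least
    // as strict as the sector-size alignment FILE_FLAG_NO_BUFFERING needs.
    BYTE *buffer = (BYTE *)VirtualAlloc(NULL, HEAP_SIZE,
                                        MEM_COMMIT | MEM_RESERVE, PAGE_READWRITE);
    if (!buffer) return 1;

    HANDLE hFile = CreateFileA("D:\\somefile.dat", GENERIC_READ, FILE_SHARE_READ,
                               NULL, OPEN_EXISTING,
                               FILE_ATTRIBUTE_NORMAL | FILE_FLAG_NO_BUFFERING, NULL);
    if (hFile == INVALID_HANDLE_VALUE) return 1;

    DWORD nBytesRead;
    while (ReadFile(hFile, buffer, HEAP_SIZE, &nBytesRead, NULL) && nBytesRead > 0)
    {
        // ... CRC buffer[0..nBytesRead) here; note the final block of the
        // file may be shorter than HEAP_SIZE ...
    }
    CloseHandle(hFile);
    VirtualFree(buffer, 0, MEM_RELEASE);
    return 0;
}
```

The flag bypasses the system file cache entirely, which is likely why it helps here: the sequential DVD read no longer competes with cache-management overhead.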
