Code Project
Home / The Lounge / Large network file copy

Large network file copy

Tags: tools, question, csharp, php, database
15 Posts 8 Posters 19 Views 1 Watching
Chris Meech

8GB in 8 hours? That's approx 250 KBPS. Are you really sure that software is going to fix this problem? My first take would be that you need a larger pipe. :)

    Chris Meech I am Canadian. [heard in a local bar] In theory there is no difference between theory and practice. In practice there is. [Yogi Berra]
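A quick sanity check on that figure (assuming 8 GB means 8 × 1024³ bytes):

```python
# Back-of-envelope check of the "8 GB in 8 hours" figure.
size_bytes = 8 * 1024**3            # 8 GiB
seconds = 8 * 3600                  # 8 hours
bytes_per_sec = size_bytes / seconds
kib_per_sec = bytes_per_sec / 1024
mbit_per_sec = bytes_per_sec * 8 / 1e6
print(round(kib_per_sec), round(mbit_per_sec, 1))  # prints: 291 2.4
```

So the copy is already moving roughly 2.4 Mbit/s, which is not far below the 3 Mbps DSL figure mentioned in the reply below — consistent with the "larger pipe" diagnosis.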

Yusuf (#3)

Chris Meech wrote:

8GB in 8 hours? That's approx 250 KBPS.

That is all I get with RoboCopy, since it uses buffered I/O (I read that somewhere). In fact, over the same pipe I can get up to 500 KB/s using other tools. Between the two machines we have a DSL connection with a throughput of 3 Mbps.

Chris Meech wrote:

Are you really sure that software is going to fix this problem?

I am testing ESEUTIL.exe (an Exchange Server utility that does an unbuffered I/O copy). I am hoping to get better throughput with this tool. Another issue is disk fragmentation: because of the large file copies, it does not take long before the disk gets fragmented.

    Yusuf May I help you?

Yusuf (original post)

As part of a nightly backup job, we get a copy of the production database onto one of the development/testing machines. The database backup file is a little over 8GB, and with RoboCopy it takes us close to 8 hours to copy. I am looking for an alternate file copy utility that can copy large files faster. Googling for a solution, I came across Aaron Maxwell's suggestion [^]. I am testing Roadkil's Unstoppable Copier [^] and TeraCopy [^]. Both tools claim they can copy the file over the network in 4 hours, which would cut the time in half. Do you copy large files over the network regularly? What tool(s) do you use? Any recommendations?

      Yusuf May I help you?

CMTietgen (#4)

That is super slow. We copy database backups totaling around 150-200GB to a backup server and restore the databases, then copy and restore the log files (some around 3-4GB) made since the last full backup, all over a 100 Mbps network, and it's done around 12-15 hours later. Not sure why yours is so slow. I'm not a network guy, but maybe you have a bad NIC, bad wiring, or are running in half-duplex mode? Someone more knowledgeable may know what to check. CT


Electron Shepherd (#5)

        Have you tried compressing the backup file before the copy?

        Server and Network Monitoring

Yusuf (#6)

          CincDev wrote:

          Not sure why yours is so slow. I'm not a network guy, but maybe you have a bad NIC, wiring, running in half duplex mode? Someone more knowledgeable may know what to check.

Good points, will check. Thanks.

          Yusuf May I help you?


Rob Graham (#7)

I use Free Download Manager for this. It supports multiple simultaneous connections to the same file (copying it in sections), plus pause and restart. Significantly better than a plain old copy over the network. FDM [^]

Electron Shepherd wrote:

Have you tried compressing the backup file before the copy?

Yusuf (#8)

No, but I am not sure how much difference it would make. It is a database backup file, and I am assuming it may already be somewhat compressed. :~

              Yusuf May I help you?


CMTietgen (#9)

Compression can make a difference even on database backups, if they don't contain binary data; however, it's not worth it in my opinion. We used to compress our backups, copy them, decompress them, and then restore them, which took longer than just copying the uncompressed file. It wasn't worth the time or drive space, since you have to have plenty of room for both the compressed file and the temporary file during decompression. Some of our databases are 21GB or more, so that's 40GB+ to decompress a file, and the loser who built our server got the smallest drives he could find, so we were always running low on space. Thank God we are getting a new server with a disk array soon. :) Anyway, don't worry about compression unless you plan to leave the backups compressed and not restore them. CT
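That trade-off is easy to put in numbers: compression wins only if compress + transfer-compressed + decompress beats transferring raw. A sketch with illustrative, made-up rates (all parameters are assumptions, not measurements from this thread):

```python
def compression_wins(size_gb, ratio, link_mb_s, compress_mb_s, decompress_mb_s):
    """True if compress -> transfer -> decompress beats a raw transfer.

    ratio is compressed size / original size (e.g. 0.5 = halved).
    All rates are in MB/s.
    """
    size_mb = size_gb * 1024
    raw = size_mb / link_mb_s
    piped = (size_mb / compress_mb_s                 # compress
             + size_mb * ratio / link_mb_s           # transfer smaller file
             + size_mb * ratio / decompress_mb_s)    # decompress
    return piped < raw
```

On a ~0.3 MB/s link like the one in this thread, almost any compression ratio wins; on a fast LAN the compress/decompress steps dominate, which matches CMTietgen's experience.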

Yusuf (#10)

                  :-D

                  Yusuf May I help you?

Yusuf (#11)

                    Thanks Rob, I'll check it out.

                    Yusuf May I help you?


Mycroft Holmes (#12)

We use Red-Gate backup on SQL 2005, and the shrinkage is impressive: around 85%. Most (all) of our data is text.


Gary McDonnell (#13)

I use Rsync for Windows. Once the initial backup is complete, Rsync copies only those portions of the file that have changed - ideal for WAN / Internet-based backups. Plus, the software is free. The implementation I use is called DeltaCopy.

I replicate about 50 GB of data over a 512 kbps uplink every night... takes about 90 minutes. This includes two Exchange databases and a few small SQL databases. Again, only the changes to the files are moved.

Rsync originated in the Unix world and, as with many Unix utilities, is a command-line program with all kinds of parameters - but it's no more complicated than RoboCopy. DeltaCopy wraps Rsync in a simple GUI. http://www.aboutmyip.com/AboutMyXApp/DeltaCopy.jsp [^]
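The core of the rsync idea can be sketched in a few lines: hash fixed-size blocks of each copy and resend only the blocks whose hashes differ. Real rsync additionally uses a rolling checksum so it can match blocks at any byte offset; this simplified sketch only matches aligned blocks:

```python
import hashlib

BLOCK = 4096  # block size is an arbitrary choice for the sketch

def block_hashes(data, block=BLOCK):
    # One digest per fixed-size block of the file's contents.
    return [hashlib.md5(data[i:i + block]).digest()
            for i in range(0, len(data), block)]

def changed_blocks(old, new, block=BLOCK):
    """Indices of aligned blocks that must be resent to turn old into new."""
    old_h = block_hashes(old, block)
    new_h = block_hashes(new, block)
    return [i for i, h in enumerate(new_h)
            if i >= len(old_h) or old_h[i] != h]
```

For a nightly database backup where most pages are unchanged, shipping only the changed blocks is what lets 50 GB fit through a 512 kbps uplink in 90 minutes.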

Yusuf (#14)

Gary, DeltaCopy looks interesting; I'll give it a test drive. Thanks.

                          Yusuf May I help you?


ecooke (#15)

Have you tried automated FTP scripts using scheduled tasks? You might have to take the DB down, copy it to another file, bring the DB back up, and transfer that. Or, in a SQL task, run a backup, then FTP two files: the DB backup and a separate empty text file, uploaded AFTER the DB backup finishes. On the remote server, have something running that watches for that empty text file and then does a restore. That's how we did our backups, though in our case it was just to wait for Windows backups to finish before starting the tape backups.

                            Those who are too smart to engage in politics are punished by being governed by those who are dumber. - Aristotle
