Code Project
Home › General Programming › C#

How to implement multi-part FTP

Tags: sysadmin, performance, tutorial, question
Michael J Eber (#1)

    We have a need to send very large files to an internal FTP server. These files are videos that may last as long as an hour. All of the FTP samples I find have the standard code in them:

        request.Method = WebRequestMethods.Ftp.UploadFile;
        request.Timeout = CONNECTION_TIMEOUT;
        request.ReadWriteTimeout = CONNECTION_TIMEOUT;
        request.UsePassive = false;

        // Set credentials if necessary
        if (credentials != null)
            request.Credentials = credentials;

        // Read source file
        StreamReader sourceStream = new StreamReader(source);
        byte[] fileContents = Encoding.UTF8.GetBytes(sourceStream.ReadToEnd()); // <--- this is where we bomb!
        sourceStream.Close();
        request.ContentLength = fileContents.Length;

        // Send contents
        Stream requestStream = request.GetRequestStream();
        requestStream.Write(fileContents, 0, fileContents.Length);
        requestStream.Close();

    This logic gives us an out-of-memory condition with very large files. What I'd like to do is send the file as a multi-part file, transmitted as small chunks instead of a single large stream. Does anyone know how this is implemented?

    Software Zen: delete this;

Luc Pattyn (#2)

      I see no reason to go for complex schemes; however, it makes no sense to me that you first read the entire file into memory and then send it in a single write. Both operations are stream operations, so use them as such: small amounts, in a loop. And I'm puzzled by the Encoding.UTF8.GetBytes statement; there is no text involved anywhere, so why would one need an Encoding? All it takes is byte transfers: byte array in, byte array out. :)

      Luc Pattyn [My Articles] Nil Volentibus Arduum


      Fed up by FireFox memory leaks I switched to Opera and now CP doesn't perform its paste magic, so links will not be offered. Sorry.
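Editor's note: a minimal sketch of the streaming loop described above, assuming an `FtpWebRequest` as in the original snippet. The method name `Upload` and its parameters are placeholders, not part of the thread.

```csharp
using System;
using System.IO;
using System.Net;

class StreamedFtpUpload
{
    // Read the source file in small chunks and write each chunk straight
    // to the FTP request stream, so the whole file is never held in memory.
    static void Upload(string localPath, string ftpUri, NetworkCredential credentials)
    {
        FtpWebRequest request = (FtpWebRequest)WebRequest.Create(ftpUri);
        request.Method = WebRequestMethods.Ftp.UploadFile;
        if (credentials != null)
            request.Credentials = credentials;

        using (FileStream source = File.OpenRead(localPath))
        using (Stream destination = request.GetRequestStream())
        {
            byte[] buffer = new byte[8192]; // small, fixed-size chunks
            int bytesRead;
            while ((bytesRead = source.Read(buffer, 0, buffer.Length)) > 0)
                destination.Write(buffer, 0, bytesRead);
        }
    }
}
```

Note there is no `Encoding` anywhere: the file is moved as raw bytes, which is what a binary FTP transfer needs.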

Michael J Eber (#3)

        Well, Luc, you took my post too literally. The code I posted was one of the first examples I quickly grabbed. Essentially all the code is as follows: get a stream reader, read the data into a byte array, create a stream writer, feed it the byte array. Our issue, as I noted in the example, is that the byte[] buffer = stream.ReadToEnd() always throws an out-of-memory exception. I had read that for very large files you can do a multi-part transfer, so that you read and transmit small packets and the FTP server puts the packets together into the original massive file. That is what I'm looking for, so that we do not blow out our memory again.

        Software Zen: delete this;

Luc Pattyn (#4)

          You should read my post and take it literally. Do NOT use ReadToEnd(); use a loop, and read and write smaller chunks. That is called streaming.

          Luc Pattyn [My Articles] Nil Volentibus Arduum



BobJanova (#5)

            You just mean chunking the file? I've not heard of 'multi-part FTP'. If that is what you mean, it's a very common pattern; roughly speaking:

                Stream inStream = GetInputStream(); // e.g. new FileStream() etc
                Stream outStream = request.GetRequestStream();

                // Anything up to a few meg is okay, but making this similar to the
                // TCP packet size makes sense
                const int blocksize = 8192;
                byte[] buf = new byte[blocksize];

                int read;
                while (0 < (read = inStream.Read(buf, 0, blocksize)))
                    outStream.Write(buf, 0, read);
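Editor's note: on .NET 4 and later, Stream.CopyTo performs exactly this read/write loop internally, so the chunked upload collapses to a single call. The server URI and file path below are placeholders.

```csharp
using System.IO;
using System.Net;

class ChunkedFtpUpload
{
    static void Main()
    {
        // Placeholder URI and path -- substitute your own server and file.
        FtpWebRequest request = (FtpWebRequest)WebRequest.Create("ftp://server/video.mp4");
        request.Method = WebRequestMethods.Ftp.UploadFile;

        using (FileStream inStream = File.OpenRead(@"C:\video.mp4"))
        using (Stream outStream = request.GetRequestStream())
        {
            // CopyTo reads and writes in fixed-size chunks internally,
            // so the whole file is never held in memory.
            inStream.CopyTo(outStream, 8192);
        }
    }
}
```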
