How to implement multi-part FTP
-
We need to send very large files to an internal FTP server. These files are videos that may run as long as an hour. All of the FTP samples I find have the same standard code in them:
request.Method = WebRequestMethods.Ftp.UploadFile;
request.Timeout = CONNECTION_TIMEOUT;
request.ReadWriteTimeout = CONNECTION_TIMEOUT;
request.UsePassive = false;
//Set credentials if necessary
if (credentials != null)
request.Credentials = credentials;
//Read source file
StreamReader sourceStream = new StreamReader(source);
byte[] fileContents = Encoding.UTF8.GetBytes(sourceStream.ReadToEnd()); //<--- this is where we bomb!
sourceStream.Close();
request.ContentLength = fileContents.Length;
//Send contents
Stream requestStream = request.GetRequestStream();
requestStream.Write(fileContents, 0, fileContents.Length);
requestStream.Close();
This logic gives us an out-of-memory condition with very large files. What I'd like to do is send the file in small chunks (multi-part) instead of as a single large stream. Does anyone know how this is implemented?
Software Zen: delete this;
-
I see no reason to go for complex schemes; however, it makes no sense to me that you first read the entire file into memory and then send it in a single write. Both operations are stream operations, so use them as such, with small amounts, and in a loop. And then, I'm puzzled by the
Encoding.UTF8.GetBytes
statement; there is no text involved anywhere, so why would one need an Encoding? All it takes is byte transfers: byte array in, byte array out. :)
Luc Pattyn [My Articles] Nil Volentibus Arduum
Fed up by FireFox memory leaks I switched to Opera and now CP doesn't perform its paste magic, so links will not be offered. Sorry.
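A minimal sketch of the loop described above, assuming the same request (the FtpWebRequest) and source (the file path) from the original post; the 8 KB buffer size is an arbitrary choice, and the snippet needs System.IO and System.Net:
using (FileStream fileStream = File.OpenRead(source))
using (Stream requestStream = request.GetRequestStream())
{
    byte[] buffer = new byte[8192];
    int bytesRead;
    // Read a small chunk, write that chunk, repeat until the file is done;
    // only 8 KB is ever held in memory at a time.
    while ((bytesRead = fileStream.Read(buffer, 0, buffer.Length)) > 0)
        requestStream.Write(buffer, 0, bytesRead); // byte array in, byte array out; no Encoding
}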
-
Well, Luc, you took my post too literally. The code I posted was one of the first examples I quickly grabbed. Essentially all the code is as follows: get a stream reader, read the data into a byte array, create a stream writer, and feed it the byte array. Our issue, as I noted in the example, is that the byte[] buffer = stream.ReadToEnd() call always throws an out-of-memory exception. I had read that for very large files you can do a multi-part upload, so that you read and transmit small packets and the FTP server puts the packets together into the original massive file. That is what I'm looking for, so that we do not blow out our memory again.
Software Zen: delete this;
-
You should read my post and take it literally. Do NOT use ReadToEnd(); use a loop, and read and write smaller chunks. That is called streaming.
Luc Pattyn [My Articles] Nil Volentibus Arduum
Fed up by FireFox memory leaks I switched to Opera and now CP doesn't perform its paste magic, so links will not be offered. Sorry.
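As a side note, if you are on .NET 4.0 or later, Stream.CopyTo wraps exactly that read/write loop; a one-line sketch, assuming the same fileStream/requestStream pair as in the earlier snippet:
// CopyTo performs the same buffered read/write loop internally;
// the second argument controls how much is held in memory at once.
fileStream.CopyTo(requestStream, 8192);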
-
You just mean chunking the file? I've not heard of 'multi-part FTP'. If that is what you mean, it's a very common pattern; roughly speaking:
Stream inStream = GetInputStream(); // e.g. new FileStream() etc.
Stream outStream = request.GetRequestStream();

// Anything up to a few meg is okay, but making this similar to the TCP packet
// size makes sense
const int blocksize = 8192;
byte[] buf = new byte[blocksize];
int read = 0;
while (0 < (read = inStream.Read(buf, 0, blocksize)))
    outStream.Write(buf, 0, read);
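To put that into context with the settings from the original post, here is a rough, untested sketch of a complete chunked upload; the method name and parameters are placeholders, the passive-mode and credential handling are carried over from the question, and the timeouts are omitted for brevity:
using System;
using System.IO;
using System.Net;

public static void UploadLargeFile(string ftpUri, string localPath, ICredentials credentials)
{
    FtpWebRequest request = (FtpWebRequest)WebRequest.Create(ftpUri);
    request.Method = WebRequestMethods.Ftp.UploadFile;
    request.UseBinary = true;       // video files are binary data, not text
    request.UsePassive = false;     // carried over from the original post
    if (credentials != null)
        request.Credentials = credentials;

    const int blocksize = 8192;
    byte[] buf = new byte[blocksize];
    int read;

    using (FileStream inStream = File.OpenRead(localPath))
    using (Stream outStream = request.GetRequestStream())
    {
        // Stream the file in fixed-size chunks; only 'blocksize' bytes
        // are in memory at any time, regardless of the file's size.
        while ((read = inStream.Read(buf, 0, blocksize)) > 0)
            outStream.Write(buf, 0, read);
    }

    using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
    {
        Console.WriteLine(response.StatusDescription);
    }
}
The key point is that nothing larger than blocksize ever sits in memory at once, which is what removes the out-of-memory failure.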