How to debug a PHP memory leak
-
Hi guys, I'm using MAMP v2.0 on Mac ____ with Apache/2.0.64 (Unix) -- PHP/5.3.5 -- DAV/2 mod_ssl/2.0.64 -- OpenSSL/0.9.7l -- MySQL 5.5.9.

I have a script that appears to have a major memory leak, which I have attempted to debug but cannot work out how to fix. The script is part of a file manager module. It processes the download of a file when given an ID. The entire file is stored in a database table as a BLOB, in 64 KB chunks (one per record), and is streamed down to the client on request.

Database: file_management
Tables: file_details, file_data

file_details:
FileID - int(10) AUTO_INCREMENT
FileTypeID - int(10)
FileType - varchar(60)
FileName - varchar(255)
FileDescription - varchar(255)
FileSize - bigint(20)
FileUploadDate - datetime
FileUploadBy - int(5)

file_data:
FileDataID - int(10) AUTO_INCREMENT
FileID - int(10)
FileData - BLOB

The error I am actually getting is this one (from the PHP error log):

[31-Oct-2011 09:47:39] PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 63326173 bytes) in /root/htdocs/file_manager/file_manager_download.php on line 150

The download works if the file is small enough -- in this case, less than 40 MB -- but once it goes over that, like the 60 MB file in the error above, it fails: all it downloads is a 0 KB file. Obviously, 134217728 bytes is more than 63326173 bytes (128 MB vs. 60 MB). The "Allowed memory size of 134217728 bytes" comes from this directive in php.ini:

memory_limit = 128M ; Maximum amount of memory a script may consume

If I set this to 256M, it allows me to download that 60 MB file, as well as files up to about 80 MB. If I set it to 1024M, it allows me to download a 260 MB file and possibly bigger. So you can see the problem is a leak somewhere in the script that is eating up all the memory. Here is the download script:
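The script itself didn't survive the paste, but based on the description (metadata in file_details, 64 KB BLOB chunks in file_data, legacy mysql_* API on PHP 5.3), a minimal sketch of the kind of buffered download loop that produces this error might look like the following. The connection credentials and $_GET parameter name are assumptions; the table and column names come from the schema above.

```php
<?php
// Hypothetical sketch of the download script described above,
// using the legacy mysql_* API the post is written against (PHP 5.3).
$link = mysql_connect('localhost', 'user', 'password'); // credentials assumed
mysql_select_db('file_management', $link);

$fileID = (int) $_GET['FileID']; // parameter name assumed

// Fetch the file's metadata for the response headers.
$result = mysql_query(
    "SELECT FileName, FileSize FROM file_details WHERE FileID = $fileID"
);
$file = mysql_fetch_assoc($result);

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $file['FileName'] . '"');
header('Content-Length: ' . $file['FileSize']);

// Stream the 64 KB chunks in order. Note that mysql_query() buffers the
// ENTIRE result set in PHP's memory before the loop even starts -- which
// is where a 60 MB file collides with a 128 MB memory_limit.
$result = mysql_query(
    "SELECT FileData FROM file_data WHERE FileID = $fileID ORDER BY FileDataID"
);
while ($row = mysql_fetch_array($result)) {
    echo $row['FileData'];
}
```

This sketch needs a live MySQL database (and a pre-7.0 PHP) to run, so treat it as an illustration of the failure mode rather than drop-in code.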
-
I have partially solved this problem. I was told on another forum to add ob_flush() to the while loop where I am echoing out my file chunks. This works in part: it now allows me to download a 140 MB file with only the 128M memory_limit directive set in php.ini. However, when I attempt to download a 250 MB file, the error is thrown again:

[31-Oct-2011 11:46:01] PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 132392961 bytes) in /root/htdocs/file_manager/file_manager_download.php on line 125
[31-Oct-2011 11:46:01] PHP Stack trace:
[31-Oct-2011 11:46:01] PHP 1. {main}() /root/htdocs/file_manager/file_manager_download.php:0
[31-Oct-2011 11:46:01] PHP 2. ob_flush() /root/htdocs/file_manager/file_manager_download.php:125

As you can see, it is getting within 2 MB of the limit and then failing. I can actually see it streaming the file to the browser; it gets to about 200 MB of the 250 MB file, then fails. The while loop looks like:

while ($row_GetFileDataBlobs = mysql_fetch_array($result_GetFileDataBlobs)) {
    echo $row_GetFileDataBlobs["FileData"];
    ob_flush();
}

So it appears to be working better than before, but really large files still don't work. The question is: do I need to increase my memory_limit directive, or should it be streaming effectively within the 128M limit, meaning I still have a problem with how my code is, or is not, releasing the memory it no longer needs?

Cheers,
Mick
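One detail worth noting: ob_flush() only empties PHP's userland output buffer into the layer below it (the SAPI/web-server buffer); flush() is the call that asks that lower layer to push the bytes to the client. A common pattern is to pair the two on every chunk, as in this sketch of the same loop (variable names taken from the post; the pairing with flush() is my addition, not something the post confirms was tried):

```php
while ($row_GetFileDataBlobs = mysql_fetch_array($result_GetFileDataBlobs)) {
    echo $row_GetFileDataBlobs["FileData"];
    ob_flush();  // empty PHP's output buffer into the web server's buffer
    flush();     // ask the web server to send its buffer to the client
}
```

Even with both calls, the buffered result set from mysql_query() still sits in memory for the whole request, which would explain why very large files continue to fail.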
-
OK, I have added to it and resolved the problem. The problem was twofold.

1 - I wasn't using ob_flush() inside the while loop. I added that in and it appeared to free up a lot of memory, enabling larger downloads, though not unlimited ones. For example, with memory_limit = 128M I could now download more than 40 MB -- in fact, up to around 200 MB -- but that is where it failed again. First memory issue solved, though. LESSON 1: Flush your output buffer!

2 - I was using mysql_query to retrieve the results of my SQL query. The problem is that it buffers the entire result set in PHP's memory, and this was adding to my memory limit issue. I ended up using mysql_unbuffered_query instead, and this now works flawlessly. It does come with some limitations, though: the connection is tied up until all the results have been read, so you can't issue another query on the same link in the meantime. LESSON 2: Don't buffer MySQL results if not required! (within programmatic limitations)

FINAL LESSON: All of these fixes work; however, it requires some more testing to ensure there are no problems with the combination of them. Also, I have learned a lot more about output buffering and PHP memory allocation. I just wish there was a way to visually debug the process a little better than what Xdebug offers. If anyone has any ideas on how Xdebug could have shed some light on this process, please let me know in the comments. Hope this helps someone else out in the future.
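Putting the two fixes together, the streaming loop would look something like the sketch below. The query and variable names are assumptions consistent with the earlier posts; the key points are the unbuffered fetch and the per-chunk flushing:

```php
<?php
// Combined fix: unbuffered query + flushing after every chunk.
// With mysql_unbuffered_query(), rows are pulled from the MySQL server
// one at a time as you fetch them, instead of the whole result set
// being copied into PHP's memory up front.
$result = mysql_unbuffered_query(
    "SELECT FileData FROM file_data WHERE FileID = $fileID ORDER BY FileDataID"
);
while ($row = mysql_fetch_array($result)) {
    echo $row['FileData'];  // one 64 KB chunk
    ob_flush();             // hand the chunk to the web server
    flush();                // push it on to the client
}
// Only one chunk is ever held in PHP's memory at a time, so the
// memory footprint stays flat regardless of total file size.
```

As the post notes, the trade-off is that the connection can't be reused for other queries until the last row has been fetched, and functions like mysql_num_rows() won't work on an unbuffered result.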
-
Also, use the unset() function to remove unused variables and arrays.
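For anyone unfamiliar with it, unset() destroys a variable so its memory can be reclaimed by PHP's allocator; a tiny self-contained illustration:

```php
<?php
// Freeing a large variable explicitly once it is no longer needed.
$chunk = str_repeat('x', 64 * 1024);  // stand-in for a 64 KB BLOB chunk
echo strlen($chunk), "\n";            // prints 65536
unset($chunk);                        // destroy the variable
var_dump(isset($chunk));              // prints bool(false)
```

Inside a long-running loop this matters less when each iteration simply overwrites the same variable, but it helps when large intermediate values would otherwise stay in scope until the end of the request.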