Posted by Hubort on 12/06/05 12:31
Hello,
I'm not sure whether this is PHP- or Apache-related, but I have to begin
somewhere, so I'll start with PHP :)
We have Apache serving large files through a PHP script. The function that
does the work is quite simple and looks like this:
function readfile_chunked($filename) {
    $chunksize = 1 * (1024 * 1024); // how many bytes per chunk (1 MB)
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        echo fread($handle, $chunksize);
        // Only flush PHP's output buffer if one is active,
        // otherwise ob_flush() raises a notice.
        if (ob_get_level() > 0) {
            ob_flush();
        }
        flush();
    }
    return fclose($handle);
}
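For completeness, this is roughly how we call it; the file path and headers below are just illustrative, and the fallback definition is only there so the sketch runs on its own:

```php
<?php
// Fallback copy of the function above, so this sketch is self-contained.
// (The ob_get_level() guard avoids a notice when no output buffer is active.)
if (!function_exists('readfile_chunked')) {
    function readfile_chunked($filename) {
        $chunksize = 1 * (1024 * 1024); // 1 MB per chunk
        $handle = fopen($filename, 'rb');
        if ($handle === false) {
            return false;
        }
        while (!feof($handle)) {
            echo fread($handle, $chunksize);
            if (ob_get_level() > 0) {
                ob_flush();
            }
            flush();
        }
        return fclose($handle);
    }
}

// Illustrative call: stream a 3 MB scratch file with download headers.
$path = tempnam(sys_get_temp_dir(), 'big');
file_put_contents($path, str_repeat('x', 3 * 1024 * 1024));
$bytes = filesize($path);

header('Content-Type: application/octet-stream');
header('Content-Length: ' . $bytes);
header('Content-Disposition: attachment; filename="download.bin"');

readfile_chunked($path);
unlink($path);
```

Sending Content-Length up front matters for large downloads, since the client can then show progress and detect truncated transfers.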
We use this function instead of PHP's standard readfile() function because
it _should_ consume much less memory for large files (with readfile() we
get Apache processes as large as the files being served).
Unfortunately, this doesn't help much, and we still have processes that
use huge amounts of memory. Does anybody know what is wrong, or how to
solve this problem?
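For what it's worth, one thing I suspect needs ruling out is output buffering or compression holding the whole response in memory before it reaches the client (PHP's own buffers, zlib.output_compression, or Apache-side compression such as mod_deflate). A quick check, using only standard PHP functions, could look like this:

```php
<?php
// Quick diagnostic: if any of these are active, each flushed chunk may be
// retained in a server-side buffer instead of going out to the client,
// so memory grows with the file size despite the chunked reads.
$checks = array(
    'ob_get_level'            => ob_get_level(),              // nesting depth of active PHP output buffers
    'output_buffering'        => ini_get('output_buffering'), // php.ini buffer size; 0 or '' means off
    'zlib.output_compression' => ini_get('zlib.output_compression'),
);
print_r($checks);
```

Apache-side buffering (e.g. mod_deflate) would not show up here and would have to be checked in the server config instead.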
Any help much appreciated.
regards
Hubort