Posted by J.O. Aho on 12/06/05 12:54
Hubort wrote:
> Hello,
>
> I'm not sure if this is PHP or Apache related, but I have to begin
> with something, so I'll start with PHP :)
>
> We have Apache serving large files using a PHP script. The function
> that does it is quite simple and looks like this:
>
> function readfile_chunked($filename) {
>     $chunksize = 1 * (1024 * 1024); // how many bytes per chunk
>     $buffer = '';
>     $handle = fopen($filename, 'rb');
>     if ($handle === false) {
>         return false;
>     }
>     while (!feof($handle)) {
>         $buffer = fread($handle, $chunksize);
>         echo $buffer;
>         ob_flush();
>         flush();
>     }
>     return fclose($handle);
> }
>
> We use this function instead of the standard PHP readfile() function
> because it _should_ consume much less memory for large files (with
> readfile() we get Apache threads as large as the files being served).
> Unfortunately, this doesn't help much and we still have processes
> that use huge amounts of memory. Does anybody know what is wrong or
> how to solve this problem?
I think (not sure) the problem here is the output buffer: you should
run ob_end_flush(), since both ob_flush() and flush() keep the output
buffer active.
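Something along these lines might do it (an untested sketch; it
assumes nothing else in the script needs output buffering):

function readfile_chunked($filename) {
    $chunksize = 1 * (1024 * 1024); // how many bytes per chunk

    // Turn off every active output buffer first, so echoed chunks
    // go straight to Apache instead of accumulating in memory.
    while (ob_get_level() > 0) {
        ob_end_flush();
    }

    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        echo fread($handle, $chunksize);
        flush(); // push the chunk out to the client
    }
    return fclose($handle);
}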
//Aho