Posted by ZeldorBlat on 06/15/06 03:03
comp.lang.php wrote:
> [PHP]
> if (!function_exists('bigfile')) {
>     /**
>      * Works like file() in PHP except that it works more efficiently
>      * with very large files
>      *
>      * @access public
>      * @param mixed $fullFilePath
>      * @return array $lineArray
>      * @see actual_path
>      */
>     function bigfile($fullFilePath) {
>         // Multiply the memory limit temporarily (10x larger should
>         // *definitely* be enough until the end!)
>         @ini_set('memory_limit', (int)ini_get('memory_limit') * 10 . 'M');
>         $lineArray = array(); // initialize so the return value is always an array
>         $fileID = @fopen(actual_path($fullFilePath), 'r');
>         while (!@feof($fileID)) {
>             $buffer = @fgets($fileID, 4096);
>             $lineArray[] = $buffer;
>         }
>         @fclose($fileID);
>         return $lineArray;
>     }
> }
> [/PHP]
>
> I even temporarily increase memory (I know, a bad idea, but it's all I
> can think of to do). However, requirements stipulate that files smaller
> than the max file size (arbitrarily set) be sent as an email attachment
> (whether it actually gets sent depends, of course, on the SMTP server).
>
> I can't think of any other trick to make this either work or, failing
> that, throw an error/warning instead of timing out.
>
> Help!
>
> Thanx
> Phil
What exactly are you trying to achieve? What do you mean by "it
doesn't work"? Some more details will help us suggest a solution...
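
In the meantime, if the real goal is just to process each line of a huge
file rather than to hold the whole thing in memory at once, a streaming
loop sidesteps the memory_limit hack entirely. Here's a minimal sketch
(the filename and the line-counting body are placeholders, not anything
from your code):

[PHP]
<?php
// Minimal sketch: read a large file line by line and handle each line
// as it arrives, so memory use stays flat no matter how big the file is.
// '/tmp/huge.log' and the counting logic are just for illustration.
$fh = fopen('/tmp/huge.log', 'r');
if ($fh === false) {
    die('cannot open file');
}
$lineCount = 0;
while (($line = fgets($fh, 4096)) !== false) {
    // ...do the per-line work here instead of appending to an array...
    $lineCount++;
}
fclose($fh);
echo "read $lineCount lines\n";
?>
[/PHP]

The while (($line = fgets(...)) !== false) form also avoids the classic
pitfall in your version: if the @fopen() fails, @feof() on the bad handle
just returns false (with the warning suppressed) and the loop spins
forever. Note that, like your original, fgets() with a 4096-byte limit
will split any line longer than 4095 bytes across two array entries.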