Posted by comp.lang.php on 10/01/50 11:50
ZeldorBlat wrote:
> comp.lang.php wrote:
> > [PHP]
> > if (!function_exists('bigfile')) {
> >     /**
> >      * Works like file() in PHP except that it will work more
> >      * efficiently with very large files
> >      *
> >      * @access public
> >      * @param mixed $fullFilePath
> >      * @return array $lineArray
> >      * @see actual_path
> >      */
> >     function bigfile($fullFilePath) {
> >         // MULTIPLY YOUR MEMORY TEMPORARILY (10X LARGER SHOULD
> >         // *DEFINITELY* BE ENOUGH UNTIL END!)
> >         @ini_set('memory_limit', (int)ini_get('memory_limit') * 10 . 'M');
> >         $lineArray = array(); // initialize so an empty file still returns an array
> >         $fileID = @fopen(actual_path($fullFilePath), 'r');
> >         while (!@feof($fileID)) {
> >             $buffer = @fgets($fileID, 4096);
> >             if ($buffer !== false) { // skip the false fgets() returns at EOF
> >                 $lineArray[] = $buffer;
> >             }
> >         }
> >         @fclose($fileID);
> >         return $lineArray;
> >     }
> > }
> > [/PHP]
> >
> > I even temporarily increase memory (I know, bad idea, but it's all I
> > can think of to do). However, requirements stipulate that files
> > smaller than the maximum file size (arbitrarily set) be sent as email
> > attachments (depending, of course, on whether the SMTP server actually
> > sends them).
> >
> > I can't think of any other trick to make this work, or at least keep
> > it from timing out, other than throwing an error/warning.
> >
> > Help!
> >
> > Thanx
> > Phil
>
> What exactly are you trying to achieve? What do you mean by "it doesn't
> work"? Some more details will help us suggest a solution...
At the moment I am able to break large files up into an array, not with
file() but with my function above, bigfile(), by increasing memory
temporarily. So it seems I solved it after all; I can open and parse
larger files this way. Thanx!
Phil
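
For anyone reading this later: if the goal is just to process each line
(build a mail body, scan for matches, etc.), a memory-friendlier variant is
to hand each line to a callback as it is read, instead of collecting the
whole file into an array, so memory_limit never has to be raised. This is
only a minimal sketch, not part of the original post: the function name
bigfile_each() and the per-line handler are hypothetical, and the
actual_path() helper used by bigfile() is omitted.

[PHP]
if (!function_exists('bigfile_each')) {
    /**
     * Hypothetical companion to bigfile(): reads a large file line by
     * line and passes each line to a callback, so the whole file never
     * has to fit in memory.
     *
     * @param string $fullFilePath path to the file to read
     * @param callback $callback called once per line
     * @return bool true on success, false if the file cannot be opened
     */
    function bigfile_each($fullFilePath, $callback) {
        $fileID = @fopen($fullFilePath, 'r');
        if ($fileID === false) {
            return false; // could not open the file
        }
        while (!feof($fileID)) {
            $buffer = fgets($fileID, 4096);
            if ($buffer === false) {
                break; // read error, or EOF reached between feof() and fgets()
            }
            call_user_func($callback, $buffer); // only one line held in memory at a time
        }
        fclose($fileID);
        return true;
    }
}
[/PHP]

It would be called, for example, as
bigfile_each('/path/to/large.log', 'append_to_mail_body'), where
append_to_mail_body() is whatever per-line handler the script needs (the
name is made up for illustration).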