 Posted by comp.lang.php on 06/17/88 11:50 
ZeldorBlat wrote: 
> comp.lang.php wrote: 
> > ZeldorBlat wrote: 
> > > comp.lang.php wrote: 
> > > > ZeldorBlat wrote: 
> > > > > comp.lang.php wrote: 
> > > > > > [PHP] 
> > > > > > if (!function_exists('bigfile')) { 
> > > > > >  /** 
> > > > > >   * Works like file() in PHP except that it will work more 
> > > > > >   * efficiently with very large files 
> > > > > >   * 
> > > > > >   * @access public 
> > > > > >   * @param mixed $fullFilePath 
> > > > > >   * @return array $lineArray 
> > > > > >   * @see actual_path 
> > > > > >   */ 
> > > > > >  function bigfile($fullFilePath) { 
> > > > > >   // MULTIPLY YOUR MEMORY TEMPORARILY (10X LARGER SHOULD 
> > > > > >   // *DEFINITELY* BE ENOUGH UNTIL END!) 
> > > > > >   @ini_set('memory_limit', (int)ini_get('memory_limit') * 10 . 'M'); 
> > > > > >   $lineArray = array(); // initialize so an empty file still returns an array 
> > > > > >   $fileID = @fopen(actual_path($fullFilePath), 'r'); 
> > > > > >   if ($fileID === false) { 
> > > > > >    return $lineArray; 
> > > > > >   } 
> > > > > >   while (!feof($fileID)) { 
> > > > > >    $buffer = fgets($fileID, 4096); 
> > > > > >    if ($buffer !== false) { // fgets() returns false at EOF/error 
> > > > > >     $lineArray[] = $buffer; 
> > > > > >    } 
> > > > > >   } 
> > > > > >   fclose($fileID); 
> > > > > >   return $lineArray; 
> > > > > >  } 
> > > > > > } 
> > > > > > [/PHP] 
> > > > > > 
> > > > > > I even temporarily increase memory (I know it's a bad idea, but 
> > > > > > it's all I can think of to do). However, the requirements stipulate 
> > > > > > that files smaller than the (arbitrarily set) maximum file size be 
> > > > > > sent as an email attachment (whether it actually gets sent depends, 
> > > > > > of course, on the SMTP server). 
> > > > > > 
> > > > > > I can't think of any other trick to make this work, or at least to 
> > > > > > fail with an error/warning instead of timing out. 
> > > > > > 
> > > > > > Help! 
> > > > > > 
> > > > > > Thanx 
> > > > > > Phil 
> > > > > 
> > > > > What exactly are you trying to achieve?  What do you mean by "it 
> > > > > doesn't work?"  Some more details will help us suggest a solution... 
> > > > 
> > > > At the moment I can break large files up into an array by using my 
> > > > function above, bigfile(), instead of file() and by temporarily 
> > > > increasing memory, so it seems I solved it after all; I can open and 
> > > > parse larger files this way, so thanx! 
> > > > 
> > > > Phil 
> > > 
> > > I ask the question because you're trying to break it up into an array 
> > > of lines -- which suggests that you're doing something with the data on 
> > > a line-by-line basis.  If that's the case, why not read a single line, 
> > > do something with it, then read the next line?  Then you don't need to 
> > > load the whole thing into memory first. 
> > > 
> > > As I said before, though, it all depends on what you're trying to do. 
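
[Editor's note: a minimal sketch of the line-by-line approach suggested above. for_each_line() and the callback are hypothetical names for illustration, not anything from the thread:]

```php
<?php
// Sketch: stream a file one line at a time instead of loading the
// whole thing into an array first. Only one line is in memory at a
// time; $callback is whatever per-line work is actually needed.
function for_each_line($path, $callback) {
    $fh = fopen($path, 'r');
    if ($fh === false) {
        return false;
    }
    while (($line = fgets($fh)) !== false) {
        $callback($line);
    }
    fclose($fh);
    return true;
}
```

This keeps memory usage flat regardless of file size, so no memory_limit tricks are needed.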
> > 
> > What I am trying to do is to load the file as an attachment to an 
> > auto-generated email. 
> > 
> > Phil 
> 
> So let me make sure I understand this.  You're trying to take a file 
> that's so large that the normal file handling mechanisms can't deal 
> with it, then send that massive file as an email attachment? 
 
No trying involved; I can do it now.  Just don't use file(); use my own 
function, bigfile(), and temporarily increase memory. 
 
Business requirement, plain and simple. 
 
Phil
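
[Editor's note: for the attachment case itself, the file never has to be held in memory as an array of lines. It can be base64-encoded for the MIME body in chunks instead. A hedged sketch, with encode_attachment() as a hypothetical name; in production you would write each encoded chunk straight to the mail stream rather than accumulate it:]

```php
<?php
// Sketch: base64-encode a file for a MIME attachment by reading it
// in chunks, so only one raw chunk is in memory while encoding.
function encode_attachment($path) {
    $fh = fopen($path, 'rb');
    if ($fh === false) {
        return false;
    }
    $b64 = '';
    // Reads of 3 * 1024 bytes keep every chunk a multiple of 3 bytes,
    // so no '=' padding appears mid-stream and the encoded pieces
    // concatenate into one valid base64 string.
    while (!feof($fh)) {
        $chunk = fread($fh, 3 * 1024);
        if ($chunk !== false && $chunk !== '') {
            $b64 .= base64_encode($chunk);
        }
    }
    fclose($fh);
    // Wrap at 76 characters per line, as MIME (RFC 2045) expects.
    return chunk_split($b64, 76, "\r\n");
}
```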
 
  