can't read large files - help

Posted by comp.lang.php on 06/14/06 22:56

[PHP]
if (!function_exists('bigfile')) {
    /**
     * Works like file() in PHP except that it will work more efficiently
     * with very large files
     *
     * @access public
     * @param mixed $fullFilePath
     * @return array $lineArray
     * @see actual_path
     */
    function bigfile($fullFilePath) {
        // Multiply the memory limit temporarily (10x larger should
        // *definitely* be enough until the end!)
        @ini_set('memory_limit', (int)ini_get('memory_limit') * 10 . 'M');
        $lineArray = array();
        $fileID = @fopen(actual_path($fullFilePath), 'r');
        while (!@feof($fileID)) {
            $buffer = @fgets($fileID, 4096);
            $lineArray[] = $buffer;
        }
        @fclose($fileID);
        return $lineArray;
    }
}
[/PHP]

I even temporarily increase the memory limit (I know, bad idea, but it's all I
can think of to do). However, the requirements stipulate that files smaller
than the maximum file size (arbitrarily set) be sent as an email attachment
(whether it actually gets sent depends on the SMTP server, of course).
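
For what it's worth, the size check itself is nothing fancy; roughly this
(just a sketch: $maxFileSize and sendAsAttachment() stand in for whatever the
requirements actually define, they're not real functions from my code):

[PHP]
// Rough sketch of the size check: $maxFileSize is the arbitrary limit
// from the requirements, sendAsAttachment() is a placeholder for the
// actual mailing code.
$maxFileSize = 2 * 1024 * 1024; // e.g. 2 MB, arbitrarily set
$size = filesize(actual_path($fullFilePath));

if ($size !== false && $size < $maxFileSize) {
    sendAsAttachment($fullFilePath); // placeholder: build the mail and hand it to SMTP
} else {
    // too big to attach, so it has to be read/processed some other way
}
[/PHP]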

I can't think of any other trick to make this either work or, failing that,
throw an error/warning instead of timing out.
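
The only other thing I can think of is to stop building the array at all and
handle each line as it's read, something like this (just a sketch; handleLine()
is a placeholder for whatever gets done with each line, it's not part of my
real code):

[PHP]
// Sketch only: process each line as it is read so memory stays flat,
// instead of collecting the whole file into an array.
// handleLine() is a placeholder, not part of the real code.
function bigfile_walk($fullFilePath) {
    @set_time_limit(0); // let a long file finish instead of timing out
    $fileID = @fopen(actual_path($fullFilePath), 'r');
    if (!$fileID) {
        return false;
    }
    while (!feof($fileID)) {
        $buffer = fgets($fileID, 4096);
        if ($buffer === false) {
            break;
        }
        handleLine($buffer); // do whatever needs doing with this line
    }
    fclose($fileID);
    return true;
}
[/PHP]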

Help!

Thanx
Phil
