Posted by comp.lang.php on 06/14/06 22:56
[PHP]
if (!function_exists('bigfile')) {
/**
 * Works like file() in PHP except that it works more efficiently
 * with very large files
 *
 * @access public
 * @param mixed $fullFilePath
 * @return array $lineArray
 * @see actual_path
 */
function bigfile($fullFilePath) {
    // Multiply the memory limit temporarily (10x larger should
    // *definitely* be enough until the end!)
    @ini_set('memory_limit', (int)ini_get('memory_limit') * 10 . 'M');
    $lineArray = array();
    $fileID = fopen(actual_path($fullFilePath), 'r');
    if ($fileID === false) {
        return $lineArray; // could not open the file
    }
    while (($buffer = fgets($fileID, 4096)) !== false) {
        $lineArray[] = $buffer;
    }
    fclose($fileID);
    return $lineArray;
}
}
[/PHP]
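If the lines only need to be processed (or streamed out) one at a time, you can skip building the array entirely and never need the memory bump. A minimal sketch, assuming the caller passes a callback to handle each line (the function and parameter names here are placeholders, not part of the original code):

```php
<?php
// Stream a file line by line without holding it all in memory.
// Returns false if the file could not be opened, true otherwise.
function process_big_file($path, $callback) {
    $fh = fopen($path, 'r');
    if ($fh === false) {
        return false;
    }
    while (($line = fgets($fh)) !== false) {
        $callback($line); // handle one line at a time
    }
    fclose($fh);
    return true;
}
```

Used this way, memory stays flat no matter how large the file is, since only one line is ever buffered at a time.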
I even temporarily increase the memory limit (I know, bad idea, but it's all I can think of to do). However, the requirements stipulate that files smaller than the (arbitrarily set) maximum file size be sent as email attachments (whether they actually get sent depends, of course, on the SMTP server). I can't think of any other trick to make this either work or, failing that, throw an error/warning instead of timing out.
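One way to enforce that requirement up front is to check the file's size before ever attempting to read it, and only take the attachment path for files under the limit. A hedged sketch, where MAX_ATTACH_BYTES is a made-up placeholder for whatever the arbitrary maximum actually is:

```php
<?php
// Hypothetical size gate: only files under the (arbitrary) limit
// should go down the email-attachment route.
define('MAX_ATTACH_BYTES', 2 * 1024 * 1024); // placeholder value

function should_attach($path) {
    $size = @filesize($path);
    // filesize() returns false on failure; treat that as "too big".
    return $size !== false && $size <= MAX_ATTACH_BYTES;
}
```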
Help!
Thanx
Phil