Posted by Petr Smith on 10/26/05 13:56
Hi,
yes, there could be some problem with your code. But it depends on what
you are trying to achieve.
If you are trying to put the whole file into the database, there is no
reason to use the fgets function. Just use
$buffer = file_get_contents($filename); and put the buffer into the SQL.
There is no reason to read it line by line with fgets: the whole file
ends up in memory inside the SQL statement anyway, so it doesn't matter
how you read it.
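Something like this (just a sketch; I'm assuming a mysql_connect()
connection is already open, and "mytable" and "content" are made-up
names for your table and column):

    $buffer = file_get_contents($filename);
    // escape the file contents before embedding it in the query
    $sql = "INSERT INTO mytable (content) VALUES ('"
         . mysql_real_escape_string($buffer) . "')";
    mysql_query($sql);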
If you want to process a huge file line by line, it is a really good
idea not to load the whole file into memory (using file_get_contents)
but to read it line by line using fgets, process each line and forget
it. The file can be very large and you still need only a very small
amount of memory. And then the problem you describe can occur.
What if some line is longer than 4096 bytes (or whatever limit you
choose)? You can solve that simply with code like this:
$dataFile = fopen($filename, "r");
$buffer = "";
while (!feof($dataFile)) {
    // read at most 4095 bytes, or up to the next newline
    $buffer .= fgets($dataFile, 4096);
    // no newline yet and not at EOF: the line continues, keep reading
    if (strstr($buffer, "\n") === false && !feof($dataFile)) {
        continue;
    }
    echo $buffer; // here you have one complete line, however long
    $buffer = "";
}
fclose($dataFile);
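By the way, if you are on PHP 4.3 or newer, you can (if I remember
correctly) simply omit the length argument and fgets() will read until
the end of the line, however long it is:

    $buffer = fgets($dataFile);

The buffering loop above just also works on older versions, where
omitting the length assumed a fixed default.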
Petr
John Taylor-Johnston wrote:
> It does what I want, but I worry 4096 may not be big enough. Possible?
> Is there a way to detect the filesize and insert a value for 4096?
> $buffer = fgets($dataFile, $filesize);
> Is this what it is for?
> John
>
>
> <?php
> #http://ca3.php.net/fopen
> $filename = "/var/www/html2/assets/about.htm" ;
> $dataFile = fopen( $filename, "r" ) ;
> while (!feof($dataFile))
> {
> $buffer = fgets($dataFile, 4096);
> echo $buffer;
> }
> $sql = "insert into table ... $buffer";
> ?>