Posted by Andy Jeffries on 05/03/06 14:30
On Wed, 03 May 2006 04:26:21 -0700, yehaimanish@gmail.com wrote:
> I am developing an application to parse the content of the
> access_log and insert it into the database. Since each row is a different
> entry, I am using file() to get the contents into an array, then manipulate
> each row with foreach(...) and insert/update the database accordingly.
>
> If the file is small, it works well. If the file is large (say > 5MB), it
> generates a memory allocation error. I know that the allowed memory
> size can be increased, but I would prefer a different approach.
Why not use fgets(), which returns the data one line at a time? So instead
of your foreach you would use:
$fh = fopen("access_log", "r");
while (!feof($fh)) {
    $string = fgets($fh);
    ...
}
fclose($fh);
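Inside that loop you'd parse each line before touching the database. As a rough sketch (assuming your access_log is in the standard Common Log Format; the function name and field names here are just illustrative, adjust the regex for your actual format):

```php
<?php
// Parse one Common Log Format line into its fields.
// Returns null for malformed lines so the caller can skip them.
function parse_log_line($line) {
    // host ident user [date] "request" status bytes
    $pattern = '/^(\S+) \S+ \S+ \[([^\]]+)\] "([^"]*)" (\d{3}) (\S+)/';
    if (!preg_match($pattern, $line, $m)) {
        return null;
    }
    return array(
        'host'    => $m[1],
        'time'    => $m[2],
        'request' => $m[3],
        'status'  => (int)$m[4],
        'bytes'   => ($m[5] === '-') ? 0 : (int)$m[5],
    );
}
?>
```

Since only one line is in memory at a time, the file size no longer matters.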
With regards to pagination, do that once the data is in MySQL, using LIMIT.
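Something like this, for example (the table name `log_entries` and the page size are just placeholders for whatever your schema uses):

```php
<?php
// Build a paginated query: pages start at 1, 50 rows per page by default.
function page_query($page, $per_page = 50) {
    $offset = ($page - 1) * $per_page;
    return sprintf("SELECT * FROM log_entries ORDER BY id LIMIT %d OFFSET %d",
                   $per_page, $offset);
}
?>
```

That way MySQL only sends you the rows for the page being displayed, instead of you fetching everything and slicing it in PHP.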
Cheers,
Andy
--
Andy Jeffries MBCS CITP ZCE | gPHPEdit Lead Developer
http://www.gphpedit.org | PHP editor for Gnome 2
http://www.andyjeffries.co.uk | Personal site and photos