Posted by MS on 09/28/40 11:40
I tried set_time_limit(0); and I have the same issue.
The particular piece of code is 4 calls deep; would this have any bearing on
set_time_limit()? Do I need set_time_limit() in each of the 4 calling
scripts?
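
To illustrate what I mean, here is a minimal sketch (the function names are
made up, standing in for the 4 calling scripts) with set_time_limit(0) only
at the entry point; as far as I know the limit applies to the whole request,
so the nested calls should inherit it:

<?php
// Minimal sketch: made-up function names stand in for the 4 calling scripts.
// set_time_limit(0) is called once at the entry point; 0 means no execution
// time limit for this request, and each call also restarts the timer.

set_time_limit(0);

function levelOne()   { levelTwo(); }
function levelTwo()   { levelThree(); }
function levelThree() { levelFour(); }

function levelFour()
{
    // stand-in for the long-running feed import
    sleep(5);
}

levelOne();
echo "finished without hitting max_execution_time\n";
?>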
Alex
> It looks like your script times out. Try set_time_limit(0);
>
> Hope this helps,
>
> DXTrim
>
> >
> > $handle = fopen($feed_url, "r");
> > while (($data = fgetcsv($handle, 2000, ",")) !== FALSE) {
> > set_time_limit(600);
> > ...
> > ...
> > }
> >
> > which basically reads the $feed_url link, processes the data, and inserts
> > it into the database.
> >
> > this process is repeated for different $feed_urls
> >
> > It works for a few URLs, but the script just stops (with no
> > errors/warnings) when it reaches around 12000 database entries.
> >
> > I added set_time_limit(600); thinking that the script was timing out, but
> > it doesn't seem to have made any difference.
> >
> > My max_input_time is set to -1 (I don't know what the -1 denotes).
> >
> > Is there such a thing as a timeout on the website I am receiving the data
> > from?
> >
> > Any other ideas what could be causing this?
> >
> > Alex
> > --
> > --------------------------------------------------------------------------
> > http://www.myclubweb.co.uk - The Home of Club Websites
> > --------------------------------------------------------------------------