Posted by DXTrim on 02/22/06 16:20
Put the set_time_limit() call only at the top of the script and take it out
of the loop. If that does not help, something else is causing the timeout,
but it is difficult to say without seeing the code and the information being
retrieved.
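
Roughly like this (a minimal sketch; $feed_url and the loop body are
stand-ins for your own code). Note that set_time_limit() applies to the
whole request, so one call at the top also covers code that is several
includes or function calls deep:

<?php
// Remove the execution time limit once, at the top of the script.
// A limit of 0 means the script may run indefinitely.
set_time_limit(0);

$handle = fopen($feed_url, "r");
if ($handle !== FALSE) {
    while (($data = fgetcsv($handle, 2000, ",")) !== FALSE) {
        // ... process $data and insert it into the database ...
    }
    fclose($handle);
}
?>
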
Cheers,
DXTrim
"MS" <nospamplaesegr8t_ukuk@yahoo.co.uk> wrote in message
news:dtfcm8$53f$1@nwrdmz01.dmz.ncs.ea.ibs-infra.bt.com...
> I tried set_time_limit(0);
> and I have the same issue.
>
> The particular piece of code is 4 calls deep. Would this have any
> bearing on set_time_limit? Do I need set_time_limit in each of the 4
> calling scripts?
>
> Alex
>
> > It looks like your script times out. Try set_time_limit(0);
> >
> > Hope this helps,
> >
> > DXTrim
> >
> > >
> > > $handle = fopen($feed_url, "r");
> > > while (($data = fgetcsv($handle, 2000, ",")) !== FALSE) {
> > >     set_time_limit(600);
> > >     ...
> > >     ...
> > > }
> > >
> > > which basically reads the $feed_url link, processes the data, and
> > > inserts it into the database.
> > >
> > > This process is repeated for different $feed_urls.
> > >
> > > It works for a few URLs, but the script just stops (with no
> > > errors/warnings) when it reaches around 12000 database entries.
> > >
> > > I added set_time_limit(600); thinking that the script was timing
> > > out, but it doesn't seem to have made any difference.
> > >
> > > My max_input_time is set to -1 (I don't know what the -1 denotes).
> > >
> > > Is there such a thing as a timeout for the website I am receiving
> > > the data from?
> > >
> > > Any other ideas what could be causing this?
> > >
> > > Alex
> > > --
> > > --------------------------------------------------------------------------
> > > http://www.myclubweb.co.uk - The Home of Club Websites
> > > --------------------------------------------------------------------------
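
As for the question above about a timeout on the website the data comes
from: when fopen() opens a URL, PHP waits at most default_socket_timeout
seconds (60 by default) for the remote server, and a stalled connection can
make fgetcsv() return FALSE quietly, which would look exactly like the
script stopping with no errors or warnings. (max_input_time only limits the
parsing of request input data, and -1 there means no limit, so it is not
the problem here.) A rough sketch of raising the socket timeout and
checking whether the stream timed out; the 300-second value is just an
example:

<?php
// Assumed example value: allow the remote server up to 300 seconds
// per read instead of the default 60.
ini_set('default_socket_timeout', 300);

$handle = fopen($feed_url, "r");
if ($handle !== FALSE) {
    while (($data = fgetcsv($handle, 2000, ",")) !== FALSE) {
        // ... insert into the database ...
    }
    // If the loop ended early, see whether the stream timed out.
    $meta = stream_get_meta_data($handle);
    if ($meta['timed_out']) {
        // The remote server stopped sending data before the feed ended.
    }
    fclose($handle);
}
?>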