Posted by DXTrim on 09/28/72 11:40
Hello,
It looks like your script is timing out. Try set_time_limit(0);
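For example, at the top of the script (a minimal sketch; the error-display lines are an extra suggestion, not something your script already has):

```php
<?php
// 0 = no execution-time cap at all. Note that on non-Windows systems
// the limit counts only actual PHP execution time, so time spent
// waiting on the remote feed does not count toward it anyway.
set_time_limit(0);

// If the script dies silently, make sure errors are actually shown:
error_reporting(E_ALL);
ini_set('display_errors', '1');
```

Calling set_time_limit() inside the loop, as you do, also works: each call restarts the timer from zero.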
Hope this helps,
DXTrim
------------------------------------------------------
<?php
echo "please post to the group for the benefit of all";
?>
"MS" <nospamplaesegr8t_ukuk@yahoo.co.uk> wrote in message
news:dtd2pc$iro$1@nwrdmz03.dmz.ncs.ea.ibs-infra.bt.com...
> I have a script as below
>
> $handle = fopen($feed_url , "r");
> while (($data = fgetcsv($handle, 2000, ",")) !== FALSE) {
> set_time_limit(600);
> ...
> ...
> }
>
> which basically reads the $feed_url link and processes the data and inputs
> into the database.
>
> this process is repeated for different $feed_urls
>
> And works for a few urls but the script just stops (with no
> errors/warnings) when it reaches around 12000 database entries
>
> I added set_time_limit(600); thinking that the script was timing out, but
> it doesn't seem to have made any difference.
>
> My max_input_time is set to -1 (I don't know what the -1 denotes)
>
> Is there such a thing as a timeout for the website I am receiving the data
> from?
>
> Any other ideas what could be causing this?
>
> Alex
> --
> ----------------------------------------------------------------------------
> http://www.myclubweb.co.uk - The Home of Club Websites
> ----------------------------------------------------------------------------
>
>
>
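On Alex's question about the remote site: yes, fopen() on a URL is subject to a socket timeout (default_socket_timeout, 60 seconds by default), and a stream context lets you set it per request. A sketch of the loop with that plus a timeout check (the URL is a placeholder, and the processing step is elided as in the original):

```php
<?php
// Placeholder feed URL for illustration only.
$feed_url = 'http://example.com/feed.csv';

// Wait up to 600 seconds for the remote server before giving up.
$ctx = stream_context_create([
    'http' => ['timeout' => 600],
]);

$handle = fopen($feed_url, 'r', false, $ctx);
if ($handle === false) {
    die("could not open $feed_url\n");
}

while (($data = fgetcsv($handle, 2000, ',')) !== false) {
    set_time_limit(600); // restart the PHP execution timer each row
    // ... process $data and insert into the database ...
}

// Distinguish a normal end-of-file from the connection timing out
// mid-read, which fgetcsv() would otherwise report as a silent FALSE.
$meta = stream_get_meta_data($handle);
if ($meta['timed_out']) {
    fwrite(STDERR, "remote read timed out\n");
}
fclose($handle);
```

If the remote server drops the connection partway through, the loop simply ends with no error, which would match the symptom of stopping around 12000 rows with nothing logged.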