Posted by MS on 09/28/64 11:40
I have a script as below:
$handle = fopen($feed_url, "r");
while (($data = fgetcsv($handle, 2000, ",")) !== FALSE) {
    set_time_limit(600);
    ...
    ...
}
which reads the CSV data from $feed_url, processes it, and inserts it
into the database. This process is repeated for several different feed
URLs, roughly as sketched below.
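The outer structure looks something like this (the URL list and the
insert step here are placeholders, not my real code):

<?php
// Hypothetical list of feeds; the real URLs come from elsewhere.
$feed_urls = array(
    "http://example.com/feed1.csv",
    "http://example.com/feed2.csv",
);

foreach ($feed_urls as $feed_url) {
    $handle = fopen($feed_url, "r");
    if ($handle === FALSE) {
        // fopen() returns FALSE on failure; log it rather than fail silently.
        error_log("Could not open feed: $feed_url");
        continue;
    }
    while (($data = fgetcsv($handle, 2000, ",")) !== FALSE) {
        set_time_limit(600); // reset the execution-time counter for each row
        // ... process $data and insert it into the database ...
    }
    fclose($handle);
}
?>

Writing it out like this, I notice my real script may not be calling
fclose() on each handle between feeds.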
This works for the first few URLs, but then the script just stops (with
no errors or warnings) once it has made around 12,000 database entries.
I added set_time_limit(600); thinking that the script was timing out, but
it doesn't seem to have made any difference.
My max_input_time is set to -1 (I don't know what the -1 denotes).
Is there such a thing as a timeout on the website I am receiving the data
from?
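If a remote timeout is a possibility, my understanding is that a read
timeout can be set on the stream itself; a rough sketch (the 30-second
value is only an example, and I haven't confirmed this is my problem):

<?php
// Sketch: bound how long reads from the remote server may block.
$context = stream_context_create(array(
    "http" => array("timeout" => 30), // connection/read timeout in seconds
));
$handle = fopen($feed_url, "r", false, $context);
if ($handle !== FALSE) {
    stream_set_timeout($handle, 30); // also bound each individual read
    while (($data = fgetcsv($handle, 2000, ",")) !== FALSE) {
        // ... process $data ...
        $meta = stream_get_meta_data($handle);
        if ($meta["timed_out"]) {
            error_log("Read from $feed_url timed out");
            break;
        }
    }
    fclose($handle);
}
?>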
Any other ideas about what could be causing this?
Alex
--
----------------------------------------------------------------------------
http://www.myclubweb.co.uk - The Home of Club Websites
----------------------------------------------------------------------------