Posted by mootmail-googlegroups on 09/13/06 13:37
Jonah Bishop wrote:
> I totally agree with you, mootmail. I'd rather not override the script
> timeout value, because I'm guaranteed (to some extent) that this script
> will take longer than that.
>
> This is indeed the final step in the importing process, so I really
> like the idea of having another PHP script handle the long operation.
> But how do I get *that* script to avoid timeouts? Is sending it to the
> background sufficient? Won't it still be susceptible to the PHP timeout
> value?
>
> Additionally, how would I pass parameters to such a script?
>
> Many thanks!
> -- Jonah
As far as I know, scripts executed in command-line mode do not have a
timeout value (I have some scripts that take 2+ hours to run and don't
time out). I'm not sure whether starting a script as a background
process from a web script carries the default timeout with it; I've
never tried it that way.
Either way, you could call set_time_limit(0) just to be safe. After
all, once it's in the background, it doesn't really matter how long it
runs.
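A minimal sketch of what that could look like, assuming a worker script
called import_worker.php (the script name and paths are just
placeholders, not something from your setup):

<?php
// In the web script: launch the worker in the background so the
// request returns immediately. Output is discarded.
$cmd = '/usr/bin/php ' . escapeshellarg('/path/to/import_worker.php');
exec($cmd . ' > /dev/null 2>&1 &');
echo 'Import started; it will keep running in the background.';
?>

<?php
// At the top of import_worker.php itself:
set_time_limit(0);        // disable the execution time limit, just to be safe
ignore_user_abort(true);  // keep going even with no browser attached
// ... long-running import work goes here ...
?>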
As for passing parameters, I've never needed to, but I suspect a good
place to start would be here:
http://us3.php.net/manual/en/features.commandline.php A little ways
down it starts talking about how to pass arguments to a script from the
command line. It looks like you can pass arguments to the script and
then access them via $argv in your code.
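For example, a hypothetical invocation and the matching $argv access
might look like this (the arguments are made up for illustration):

<?php
// Invoked as: php import_worker.php /tmp/upload.csv 42
// $argv[0] is the script name; the real arguments start at $argv[1].
$file   = isset($argv[1]) ? $argv[1] : null;
$userId = isset($argv[2]) ? (int) $argv[2] : 0;

if ($file === null) {
    fwrite(STDERR, "Usage: php import_worker.php <file> <user-id>\n");
    exit(1);
}

echo "Importing $file for user $userId\n";
?>

If you launch the worker with exec() as sketched above, you'd append
the escapeshellarg()'d values to the command string and they show up in
$argv the same way.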
If that doesn't work, and you decide to go the periodic cron route, you
could always keep a 'queue' table holding the parameters that should be
passed, and have your script work through it row by row to pick them
up; a rough sketch follows.
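Roughly, the cron-driven worker could look something like this; the
table name, columns, and connection details are all assumptions for
illustration:

<?php
// Cron-driven worker: process pending jobs from a 'queue' table.
set_time_limit(0);

$db = new mysqli('localhost', 'user', 'pass', 'mydb');

$result = $db->query("SELECT id, params FROM queue WHERE status = 'pending'");
while ($row = $result->fetch_assoc()) {
    $params = unserialize($row['params']);  // parameters stored earlier by the web script
    // ... run the import for this row using $params ...
    $db->query("UPDATE queue SET status = 'done' WHERE id = " . (int) $row['id']);
}
?>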