Posted by kiretose on 11/14/07 22:17
On Wed, 14 Nov 2007 20:34:27 +0000, kiretose wrote:
> Hey guys,
>
> I'm currently designing a web crawler of sorts to download some XML
> feeds and parse the data, and I'm hoping to get it working in shared
> hosting environments. Because it only scans slowly and is designed to
> use limited resources, it shouldn't cause any problems or raise any
> flags. The one issue I'm running up against is the script's max
> execution time.
>
> Is there any way to spawn another instance of the script and close the
> initial one, daisy-chaining execution every few minutes to stay under
> the limits? I'm unclear whether options like popen() a) work in shared
> environments and b) allow the process that spawns another to be shut
> down.
>
> If anyone has any expertise here I'd absolutely love some direction!
>
> Thanks.
For those following:

exec("php -f /var/www/your-script.php > /dev/null &", $array);

(substitute the path to your own script for /var/www/your-script.php)
This spawns it as a background process (on Linux, at least), and the
script that spawned it can then exit.
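
If it helps anyone, here is a minimal sketch of the full daisy-chain
pattern built on that exec() call. It assumes the script is the crawler
itself and that exec() isn't disabled by the host (many shared hosts
block exec()/popen() via PHP's disable_functions, so check that first).
fetch_next_feeds() is a hypothetical placeholder for your own code that
grabs and parses the next XML feed; it would need to persist its own
progress marker (e.g. in a database) so each run resumes where the last
one stopped.

<?php
// Minimal daisy-chain sketch -- assumes exec() is allowed by the host.
// fetch_next_feeds() is a hypothetical stand-in for your own code that
// fetches/parses the next XML feed and returns false when nothing is left.

$maxRuntime = 50;           // stay safely under an assumed 60-second limit
$start      = time();

while (time() - $start < $maxRuntime) {
    if (!fetch_next_feeds()) {
        exit;               // nothing left to crawl; end the chain
    }
}

// Time is nearly up: relaunch this same script in the background,
// detached from this process, then let this instance finish normally.
$self = escapeshellarg(__FILE__);
exec("php -f $self > /dev/null 2>&1 &");

One note on the redirect: per the PHP docs, a program started with
exec() only keeps running in the background if its output is redirected
to a file or another stream, otherwise PHP hangs until it finishes.
Sending stderr along too (2>&1) is just a belt-and-braces addition.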