Posted by kiretose on 11/14/07 20:34
Hey guys,
I'm currently designing a web crawler of sorts to download some XML feeds and
parse the data, and I'm hoping to get it working in shared hosting
environments. Because it only scans slowly and is designed to use limited
resources, it shouldn't cause any problems or raise any flags. The one issue
I keep running up against is the script max execution time limit.
Is there any way to spawn another instance of the script and close the
initial one, daisy-chaining execution every few minutes to stay under the
limit? I'm unclear on whether options like popen() (a) work in shared
environments and (b) allow the process that spawned the new one to shut
down afterward.
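For what it's worth, one common workaround that avoids popen() entirely is to have each run do a time-boxed batch of work, then fire an HTTP request back at its own URL before exiting, so every link in the chain starts with a fresh timer. Below is a minimal sketch of that idea. Everything in it is an assumption for illustration: `crawl_one_feed()` stands in for your own fetch-and-parse step, the respawn URL is hypothetical, and it presumes fsockopen() is available (it often is on shared hosts even when popen()/exec() are disabled).

```php
<?php
// Daisy-chain sketch (assumptions: fsockopen() is enabled on the host,
// this file is reachable at a known URL, and crawl_one_feed() is a
// placeholder for your own fetch-and-parse step).

// Do a time-boxed batch of work. Returns true if work remains (so the
// caller should respawn a fresh instance), false when the queue is empty.
function run_batch(callable $crawlOne, int $budgetSeconds): bool
{
    $start = time();
    while (time() - $start < $budgetSeconds) {
        if (!$crawlOne()) {
            return false; // nothing left to do; end the chain here
        }
    }
    return true; // budget spent, more work still waiting
}

// Fire-and-forget HTTP request back at this script, then let the current
// instance exit normally. The new request starts with a fresh timer.
function respawn(string $url): void
{
    $parts = parse_url($url);
    $fp = @fsockopen($parts['host'], $parts['port'] ?? 80, $errno, $errstr, 5);
    if ($fp === false) {
        return; // couldn't reconnect; the chain simply stops
    }
    $path = $parts['path'] ?? '/';
    fwrite($fp, "GET {$path} HTTP/1.1\r\nHost: {$parts['host']}\r\nConnection: Close\r\n\r\n");
    fclose($fp); // don't wait for a response
}

// On the server, the entry point would look roughly like:
// if (run_batch('crawl_one_feed', 20)) {          // stay under a 30s limit
//     ignore_user_abort(true);                    // survive client disconnect
//     respawn('http://example.com/crawler.php');  // assumed URL of this file
// }
```

The nice property is that each run is its own short request, so nothing needs to be backgrounded or detached, which sidesteps the question of whether the spawning process can be shut down.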
If anyone has any expertise here I'd absolutely love some direction!
Thanks.