Posted by Rauch Christian on 09/19/05 14:51
Kimmo Laine wrote:
> "Shawn Wilson" <firstinitial_lastname@dvigroup.net> wrote in message
> news:O8sXe.95208$0I4.61655@fe05.news.easynews.com...
>
>>"Shawn Wilson" <firstinitial_lastname@dvigroup.net> wrote in message
>>news:jxrXe.122430$e95.105593@fe08.news.easynews.com...
>>
>>>Is there any way for me to get past a server setting of 10 seconds for
>>>PHP's maximum execution time?
>>>
>>>I've tried the set_time_limit() command and it does nothing. I set it
>>>to 100 first, then 30000, and it still cuts me off at 10 seconds saying
>>>I've exceeded the time limit.
>>>
>>>What I'm doing is stepping through a list of images in one directory
>>>and copying/resizing them into another. Right now I've got 50 images,
>>>but it could be several hundred.
>>>
>>>I was thinking the only way to get around the time limit might be to
>>>make a list of files, write that list to a text file (or a cookie, I
>>>guess), and redirect back to the page over and over, using that file to
>>>keep track of where I am in the list.
>>>
>>>Any other ideas? I think my idea is rather dirty and I'd like a smarter
>>>way to do this.
>>>
>>>Thanks!
>>
>>After a little more searching, I found that my server is not set to
>>PHP's safe mode, but my placement of the set_time_limit() command was
>>incorrect.
>>
>>I needed to put that command inside my loop so it resets with each
>>iteration. I thought I could call it once at the top of the script with
>>maybe a 5 minute timeout and be done with it, but it needs to be inside
>>the loop so it keeps resetting as long as the script is running.
>>
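For illustration, a rough and untested sketch of that in-loop placement,
assuming the GD extension is available and safe mode is off; the directory
names, the thumbnail width and the jpg-only pattern are just placeholders:

<?php
$src = '/path/to/images';   // placeholder source directory
$dst = '/path/to/thumbs';   // placeholder target directory

foreach (glob($src . '/*.jpg') as $file) {
    set_time_limit(30);     // reset the timer for every image

    $img   = imagecreatefromjpeg($file);
    $w     = imagesx($img);
    $h     = imagesy($img);
    $newW  = 200;                        // placeholder thumbnail width
    $newH  = (int) round($newW * $h / $w);
    $thumb = imagecreatetruecolor($newW, $newH);

    imagecopyresampled($thumb, $img, 0, 0, 0, 0, $newW, $newH, $w, $h);
    imagejpeg($thumb, $dst . '/' . basename($file));

    imagedestroy($img);
    imagedestroy($thumb);
}
?>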
>
> You might also try to set it to infinite:
> set_time_limit(0);
>
> 0 as the parameter sets the execution time to infinite. See if that helps.
>
As he already wrote before, he had no luck with set_time_limit.
One option could be that he backs up one table, then redirects with a
variable saying which table comes next, and so on.
This will only work if no single table is too big for the time limit!
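Untested, but the redirect part could be sketched roughly like this; the
table list, the script name backup.php and the backup_table() helper are
only placeholders for whatever the real backup code looks like:

<?php
$tables = array('customers', 'orders', 'products');   // placeholder table names
$i = isset($_GET['i']) ? (int) $_GET['i'] : 0;

if ($i < count($tables)) {
    backup_table($tables[$i]);   // hypothetical helper that dumps one table

    // redirect to the same script for the next table, so each request
    // only has to fit one table into the time limit
    header('Location: backup.php?i=' . ($i + 1));
    exit;
}

echo 'All tables backed up.';
?>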
I have not done this, but it could work. After backing up all the tables,
zip them into one file. This could work, as the time-consuming work of
backing up from MySQL is already done by then.
You can find several classes for zipping at phpclasses.org, e.g.
http://www.phpclasses.org/browse/package/2322.html and others!
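If a newer PHP with the bundled ZipArchive extension is available, the last
step could also be done without an extra class; again untested, and the
archive name and dump location are only placeholders:

<?php
$zip = new ZipArchive();

if ($zip->open('backup.zip', ZipArchive::CREATE) === true) {
    foreach (glob('dumps/*.sql') as $dump) {   // placeholder location of the dumps
        $zip->addFile($dump, basename($dump));
    }
    $zip->close();
}
?>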
All this is untested, but the idea should work!
hth,
rauch