Reply to Re: hitting the limits

Posted by Rik on 04/29/06 07:20

windandwaves wrote:
> I am managing a site, www.friars.co.nz that seems to be hitting the
> limits and I keep getting the 500 error. According to the people
> from webfarm it is because my script are too demanding

I'm no server admin, but maybe you can ask them what the most demanding
requests are?

> or not closed
> properly.
>
> I dont believe you have to "close" PHP scripts

Not to my knowledge, no.

> or even database
> connections.

Normally, no.
If your resources are stretched, it may be worth checking whether your
script opens a connection to the database once (and only once) per request,
extracts all the data it needs, and then closes the connection before it
continues with further processing. If your website is that popular, there may
be something to gain from using a persistent database connection; that
depends heavily on your scripts and how they connect.
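As a sketch of that open-once pattern (the connection details and the table and column names below are made up; in the mysql_* API of that era, mysql_pconnect() would be the persistent variant):

```php
<?php
// Hypothetical sketch: one connection per request, opened once and
// closed as soon as all data is fetched. Credentials and the
// 'members' table are invented examples.
$db = mysqli_connect('localhost', 'user', 'pass', 'friars_db');
// A persistent connection instead: mysqli_connect('p:localhost', ...)
// in later PHP versions, or mysql_pconnect() in the old mysql_* API.

$result = mysqli_query($db, 'SELECT id, name FROM members');
$rows = [];
while ($row = mysqli_fetch_assoc($result)) {
    $rows[] = $row;
}
mysqli_close($db); // close before the heavier processing starts

// ...continue processing $rows without holding the connection open
```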

I might be talking out of my ass here; I've unfortunately never had the
problem of being that popular :-).

> The site also accesses a 80 Megabyte database.

Phah! That's huge compared to what I'm used to.
I assume you've normalized the database?
Created proper indexes for faster selecting?
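A quick way to check the index question is MySQL's EXPLAIN; a hypothetical sketch ($db is an open mysqli connection, and the 'bookings' table and 'arrival_date' column are invented examples, not the real schema):

```php
<?php
// Hypothetical: run EXPLAIN on a query you suspect is slow.
// 'bookings' and 'arrival_date' are example names only.
$res = mysqli_query(
    $db,
    "EXPLAIN SELECT * FROM bookings WHERE arrival_date >= '2006-05-01'"
);
while ($row = mysqli_fetch_assoc($res)) {
    print_r($row);
    // If the 'key' column is NULL and 'rows' is large, the query scans
    // the whole table; adding an index should help, e.g.:
    //   CREATE INDEX idx_arrival ON bookings (arrival_date);
}
```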

> Do you know of a way I can find out where the problems are or
> improving the performance of the site?

To check how much time (and probably resources) your script takes, loop over
it a number of times on a local server (somewhere around 100 or more), and
at specific places in your code add the following:

At the start:
$start = microtime(true);

At key locations in the script where you want to know the time taken:
$end = microtime(true);
$time_taken[$some_specific_name] += $end - $start; // initialise the entry to 0 first
$start = $end;

Running print_r($time_taken) after 100 or so loops will make it clear which
portions of your code take the most time; then check those specific portions
to see whether they can be done more efficiently.

If you're on PHP < 5, use www.php.net's suggestion and create a function:

function microtime_float()
{
    list($usec, $sec) = explode(" ", microtime());
    return ((float)$usec + (float)$sec);
}

and replace microtime(true) with microtime_float().
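Putting the pieces above together, a self-contained sketch of the timing harness might look like this (PHP 5+ assumed, so microtime(true) returns a float; the two "sections" are stand-ins for your own code):

```php
<?php
// Sketch of the timing approach described above. The two sections
// are placeholders for real work (e.g. querying, rendering).
$time_taken = [];
$loops = 100;

for ($i = 0; $i < $loops; $i++) {
    $start = microtime(true);

    // --- section 1: stand-in for e.g. building a query ---
    $s = '';
    for ($j = 0; $j < 1000; $j++) { $s .= 'x'; }
    $end = microtime(true);
    if (!isset($time_taken['build_query'])) { $time_taken['build_query'] = 0; }
    $time_taken['build_query'] += $end - $start;
    $start = $end;

    // --- section 2: stand-in for e.g. processing results ---
    $sum = 0;
    for ($j = 0; $j < 1000; $j++) { $sum += $j; }
    $end = microtime(true);
    if (!isset($time_taken['process'])) { $time_taken['process'] = 0; }
    $time_taken['process'] += $end - $start;
    $start = $end;
}

print_r($time_taken); // cumulative seconds per section over all loops
```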

> I uses ob_start and ob_end_flush, would that cause problems?

It increases resource usage. When you're stretched it might be a problem;
normally it won't be. I don't think it's the main cause of the problem, but
it will add to it. Is there a specific reason you NEED the ob functions? If
not: don't use them.


To decrease server load you could think about a cache system: review which
database fields each of your pages depends on, create appropriate
timestamps with an index in the database if they don't exist already, and
cache the pages locally as HTML with a date/time. On a request, check whether
there is a timestamp in the database newer than that of your cached HTML. If
not, serve the cached file; if so, create a new cache file and serve that.
Creating cache files should of course happen automatically on an update of a
certain database field, but that might be even more work.
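Roughly, the check-on-request variant could look like this (everything here is a made-up example: the cache path, the connection details, the 'pages' table, its 'updated_at' column, and the build_page() function standing in for your existing page-generation code; output buffering is used deliberately here, to capture the page into a file):

```php
<?php
// Hypothetical cache check: serve a cached HTML file unless the
// database says the underlying data changed since it was written.
$cache_file = '/tmp/cache/page_home.html'; // example path

$db = mysqli_connect('localhost', 'user', 'pass', 'friars_db');
$res = mysqli_query(
    $db,
    'SELECT UNIX_TIMESTAMP(MAX(updated_at)) AS t FROM pages'
);
$row = mysqli_fetch_assoc($res);
$last_update = (int)$row['t'];
mysqli_close($db);

if (file_exists($cache_file) && filemtime($cache_file) >= $last_update) {
    readfile($cache_file);                 // cache still fresh: serve it
} else {
    ob_start();
    build_page();                          // your page-generation code
    $html = ob_get_clean();
    file_put_contents($cache_file, $html); // refresh the cache
    echo $html;
}
```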

I hope I've been of some help,
--
Rik Wasmus
