Re: Reducing load for LAMP app?

Posted by The Natural Philosopher on 01/04/08 11:28

_q_u_a_m_i_s wrote:
> On Jan 4, 10:34 am, Gilles Ganault <nos...@nospam.com> wrote:
>> Hello,
>>
>> I'm no LAMP expert, and a friend of mine is running a site which is a
>> bit overloaded. Before upgrading, he'd like to make sure there's no
>> easy way to improve efficiency.
>>
>> A couple of things:
>> - MySQL : as much as possible, he keeps query results in RAM, but
>> apparently, each is session-specific, which means that results can't
>> be shared with other users.
>> Is there something that can be done in that area, ie. keep the maximum
>> amount of MySQL data in RAM, to avoid users (both logged-on and
>> guests) hitting the single MySQL server again and again?
>
> Are you sure keeping data in RAM is a good solution? Doesn't the MySQL
> cache help?
>

It's all held in RAM by disk caching anyway.

Probably :-)

If you are running queries against the same sets of tables time and
again, those tables will be cached by the OS.

If the computer runs out of free RAM, though, expect a sudden and huge
downturn in performance.

That's something for the machine owner to fix, though.
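
Worth checking, too, whether the MySQL query cache is actually switched on
- it often isn't by default. A rough sketch of peeking at it from PHP (the
credentials here are placeholders, and this only applies to MySQL versions
that still ship the query cache):

<?php
// Hypothetical credentials - substitute the real ones.
$db = new mysqli('localhost', 'user', 'password', 'mydb');

// Is the query cache enabled, and how big is it?
$res = $db->query("SHOW VARIABLES LIKE 'query_cache%'");
while ($row = $res->fetch_assoc()) {
    echo $row['Variable_name'] . ' = ' . $row['Value'] . "\n";
}

// Hit/miss counters show whether it is actually being used.
$res = $db->query("SHOW STATUS LIKE 'Qcache%'");
while ($row = $res->fetch_assoc()) {
    echo $row['Variable_name'] . ' = ' . $row['Value'] . "\n";
}
?>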

The other thing that sometimes screws up database apps is searches on
un-indexed fields. Adding indexes can often help a lot, as can
optimising the way nested select statements are done.

The key is to reduce the amount of data as early as possible, ideally by
searching on indexed fields first.
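
As a concrete illustration (the table and column names below are made up),
EXPLAIN tells you whether MySQL is doing a full table scan on the suspect
query, and adding an index is usually the cure:

<?php
// Hypothetical connection and schema, purely for illustration.
$db = new mysqli('localhost', 'user', 'password', 'mydb');

// 1. Ask MySQL how it plans to execute the slow query.
$res = $db->query("EXPLAIN SELECT * FROM orders WHERE customer_id = 42");
while ($row = $res->fetch_assoc()) {
    // type = ALL plus a large 'rows' estimate means a full table scan.
    echo $row['table'] . ': type=' . $row['type'] . ', rows=' . $row['rows'] . "\n";
}

// 2. If customer_id is unindexed, add an index (one-off DDL, not per request).
$db->query("ALTER TABLE orders ADD INDEX idx_customer (customer_id)");
?>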

>> - His hoster says that Apache server is under significant load. At
>> this point, I don't have more details, but generally speaking, what
>> are the well-known ways to optimize PHP apps?
>>
>> Thank you.
>
> For an easy test, try installing a PHP accelerator (I've tested
> eaccelerator http://eaccelerator.net/ ). It can do a pretty good job.
> And another thing: try to make sure that the server is overloaded
> processing stuff and not because of a bad caching mechanism (google for
> browser-caching). As a rule of thumb, do not use PHP-generated
> images, or other dynamically generated content, if it's not really
> needed.
>

It's a hard call to know what is slow without access to the machine. It
may be I/O bound due to low RAM and too many disk accesses, process bound
- too many processes for the RAM available - or simply CPU bound, having
to do too much computation. My experience suggests that mostly you don't
need huge CPU power in a *server*: a fast disk (or better, several
disks) and huge amounts of RAM are the key, as is a fast network. Only
in some of the nastier SQL queries does CPU power become an issue, and
those are generally fixed by rewriting the query or indexing.

Any on-the-fly graphics computation will of course hammer the CPU, but
most people don't do that; it's more likely the images are just being
retrieved, not generated...
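
On the browser-caching point above: if semi-static content really does
have to come out of PHP, a few cache headers at least stop browsers
re-fetching it on every visit. A minimal sketch - the one-week lifetime is
just an example:

<?php
// Tell the browser (and any proxies) it may reuse this response for a week.
$lifetime = 7 * 24 * 3600; // seconds; pick whatever suits the content
header('Cache-Control: public, max-age=' . $lifetime);
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $lifetime) . ' GMT');
header('Last-Modified: ' . gmdate('D, d M Y H:i:s', filemtime(__FILE__)) . ' GMT');

// ... then emit the (expensive-to-generate) content as usual.
?>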

>
> There is no easy way to optimize PHP; as with any other language, you
> have to know exactly what goes wrong with your scripts. You can use
> xdebug (as a profiler) to see which functions eat a lot of resources.
> You can monitor the server and see why it is slowing down like
> that (need more RAM, more CPUs... more disk?).

That's an interesting point: can PHP easily get at things like CPU
utilisation, memory resource allocation and the like, so one could build
a web page to peer into the server?

I wouldn't mind one of those here..
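
For what it's worth, on Linux PHP can get at the basics fairly easily:
sys_getloadavg() for CPU load, memory_get_usage() for its own memory, and
/proc for the rest. A rough sketch of the sort of status page I mean (not
production code):

<?php
// 1, 5 and 15 minute load averages (Unix only).
list($load1, $load5, $load15) = sys_getloadavg();
echo "Load average: $load1 $load5 $load15\n";

// Memory used by this PHP process.
echo 'PHP memory in use: ' . round(memory_get_usage() / 1048576, 2) . " MB\n";

// Free disk space on the root partition.
echo 'Free disk: ' . round(disk_free_space('/') / 1073741824, 2) . " GB\n";

// System-wide memory figures straight from the kernel (Linux only).
foreach (file('/proc/meminfo') as $line) {
    if (preg_match('/^(MemTotal|MemFree|Cached):/', $line)) {
        echo $line;
    }
}
?>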

 
