Re: Database Optimization

Posted by Erwin Moller on 09/15/05 17:25

nospammmer@gmail.com wrote:

> Hello group,
>
> I have a rather general but interesting inquiry that is related to PHP
> and I hope this is the appropriate place to post it.
>
> I'm looking for a way to dramatically improve the performance of my PHP
> application. The application is getting slow as it takes on more load.
> It performs a very high number of queries to a database, and I
> believe that this is taking up most of the resources.
>
> I'm trying to figure out how I could cache the content of the database
> and avoid performing so many queries.
>
> What I would like to do is cache all the content of the database in
> memory, so that I could access it directly from my PHP application
> without querying the database, saving precious resources.
>
> The database is quite small, 15-20 MB, and its size is constant (it
> does not grow over time). 92% of the queries are SELECT; only 8%
> are UPDATE, DELETE, and INSERT.
>
> So, my question is: is it possible and advisable to place 20 MB of
> data in shared memory in order to avoid queries to the database? (All
> updates, deletes, and inserts would be performed both in the database
> and in memory.)
>
> Or would I be better off placing a copy of the database on a RAM drive?
>
> Other info:
> I have a server which runs both the PHP application and the MySQL
> database. It has 1GB of RAM. The database receives 250 queries / sec.
>
> Thank you in advance for your kind help


Hi,

I am unsure whether placing the database in memory will seriously increase
its performance: you'll have to test that.

If your database spends its time scanning tables, joining, converting,
etc., the time trimmed off could be disappointing.
If the database is non-stop reading files, it could help.
Hard to say.
Which database do you use?
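
If you do want to experiment with the shared-memory idea from your post,
PHP's sysvshm extension is one way to test it. A rough sketch only: the
segment key 0xDB01, the size, and the $rows data are all made up for
illustration, and the extension must be available in your PHP build.

<?php
// Rough sketch: cache data in a System V shared memory segment.
$shm = shm_attach(0xDB01, 20 * 1024 * 1024); // attach/create a ~20 MB segment

// Writer side: store any serializable value under an integer key.
$rows = array('example' => 'data you fetched from the database');
shm_put_var($shm, 1, $rows);

// Reader side (any later request): read it back without touching the DB.
$cached = shm_get_var($shm, 1);

shm_detach($shm);
?>

Whether this actually beats your database's own caching is exactly what
you would have to measure.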


(!) But before you go all that way, did you try some more 'old-fashioned'
optimizations?

Some ideas:
- Try to figure out which tables are scanned a lot, and place indexes on the
relevant column(s).
(If you use PostgreSQL or MySQL, the EXPLAIN command can help; see the
first sketch after this list.)

- Do your DB and your code use prepared statements?
They can help a lot, especially when the queries are complex. (See the
second sketch after this list.)

- If 50 of the 250 queries/sec are the same SELECTs whose results don't
change, you could try some smart caching.
E.g.: if a popular query is to get the latest 20 this-or-thats, with all
kinds of joins on other tables, you could schedule that query every 15
minutes and save the results in a file. Then include the file on the pages
where you need it. (See the third sketch after this list.)
Alternatively, you could just update the file whenever you know a relevant
table has changed.
(Which makes the most sense is up to you to decide, of course.)
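
First, for the indexing point, a rough MySQL sketch. It assumes an open
mysql_connect() connection, and the "articles" table and "author_id"
column are made up here:

<?php
// Ask MySQL how it executes a frequent query. In the EXPLAIN output,
// a 'key' of NULL plus a large 'rows' count means a full table scan.
$result = mysql_query("EXPLAIN SELECT * FROM articles WHERE author_id = 42");
while ($row = mysql_fetch_assoc($result)) {
    print_r($row);
}

// If it scans, index the column the query filters on:
mysql_query("CREATE INDEX idx_author ON articles (author_id)");
?>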
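
Second, for prepared statements, a minimal sketch using the mysqli
extension (PHP 5). Credentials, table, and column names are placeholders:

<?php
$db = new mysqli('localhost', 'user', 'password', 'mydb');

// The query is parsed and planned once...
$stmt = $db->prepare("SELECT title FROM articles WHERE id = ?");

// ...then executed many times with different parameter values.
foreach (array(1, 2, 3) as $id) {
    $stmt->bind_param('i', $id); // 'i' = bind as integer
    $stmt->execute();
    $stmt->bind_result($title);
    while ($stmt->fetch()) {
        echo $title, "\n";
    }
}
$stmt->close();
?>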
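
Third, for the caching-to-file idea, a sketch of the 15-minute version.
The file name, query, and table are examples only:

<?php
$cacheFile = '/tmp/latest_articles.cache';
$maxAge    = 15 * 60; // regenerate at most every 15 minutes

if (!file_exists($cacheFile) || time() - filemtime($cacheFile) > $maxAge) {
    // Cache missing or stale: run the expensive query once...
    $result = mysql_query("SELECT title, created FROM articles
                           ORDER BY created DESC LIMIT 20");
    $rows = array();
    while ($row = mysql_fetch_assoc($result)) {
        $rows[] = $row;
    }
    // ...and save the result for every later request.
    file_put_contents($cacheFile, serialize($rows));
} else {
    // Fresh enough: serve the stored copy, no database hit at all.
    $rows = unserialize(file_get_contents($cacheFile));
}
?>

At 250 queries/sec, turning a popular SELECT into one query per 15 minutes
adds up quickly.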

This kind of optimization can make a huge difference.

In general: Try to figure out which queries are executed a lot, and start
there with prepared statements/indexing/caching-to-file.

Hope this helps.

Good luck!

Regards,
Erwin Moller


