Re: application level database connection

Posted by Jerry Stuckle on 11/17/06 12:13

rich wrote:
> I have a PHP5 application that accepts external messages, and inserts
> them into a database. It's rather high traffic.. the server could be
> receiving thousands at a time, and recently mysql has been locking up
> because it's exceeding the max_connections limit. I raised it for now,
> but that's only a temporary fix.
>
> My question is is there a way to make a connection to mysql that
> multiple instances of an object will use together? I'm under the
> impression that something like a singleton would only live on a per
> instance basis, or am I incorrect?
>

That's a lot of messages to be handling. How are they getting to your
server - via web pages, a socket, or some other means? That will make a
difference.

From the MySQL end, you should start by ensuring you're using good
programming practices. Things like closing the database connection as
soon as you're through with it, rather than waiting until the end of the
script (or, even worse, depending on the garbage collector to do it for you).
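As a minimal sketch of that pattern (the credentials, table name, and the idea that the message arrives via POST are all placeholders, not part of the original post), assuming the mysqli extension available in PHP5:

```php
<?php
// Hypothetical: the message body arrives via POST.
$body = isset($_POST['body']) ? $_POST['body'] : '';

// Connection credentials are placeholders -- substitute your own.
$db = mysqli_connect('localhost', 'user', 'pass', 'messages_db');
if (!$db) {
    die('Connect failed: ' . mysqli_connect_error());
}

// Do the one piece of work this request needs...
$stmt = mysqli_prepare($db, 'INSERT INTO messages (body) VALUES (?)');
mysqli_stmt_bind_param($stmt, 's', $body);
mysqli_stmt_execute($stmt);
mysqli_stmt_close($stmt);

// ...then release the connection immediately, not at script end.
mysqli_close($db);
?>
```

The point is simply that the connection is held for as short a window as possible, so thousands of overlapping requests don't each pin a connection for their full lifetime.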

If you have SELECTs running concurrently, especially on InnoDB tables,
you may be running into locking issues. This will delay requests,
resulting in a larger number of concurrent connections.

Also, the more indexes you have on your table(s), the longer INSERT
operations will take. Check out your indexes and get rid of any you
don't need.

The recommendation of spooling to the file system can be awkward at
best. You have to ensure the file is locked before anyone writes to it,
or you can end up with a corrupted file.
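If you do go the file route anyway, something along these lines is the bare minimum needed to keep concurrent writers from interleaving. This is just a sketch using PHP's flock(); the function name is mine, not from the original post:

```php
<?php
// Append one message to a spool file, holding an exclusive
// lock so concurrent writers can't corrupt each other's data.
function spool_message($file, $data)
{
    $fp = fopen($file, 'ab');          // append, binary-safe
    if (!$fp) {
        return false;
    }
    if (!flock($fp, LOCK_EX)) {        // block until we own the file
        fclose($fp);
        return false;
    }
    fwrite($fp, $data . "\n");
    fflush($fp);                       // push data out before unlocking
    flock($fp, LOCK_UN);
    fclose($fp);
    return true;
}
?>
```

Note that flock() is advisory: every writer has to cooperate and take the lock, which is exactly the kind of discipline that makes this approach fragile.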

Personally, I like writing data to a "scratch" table in MySQL. This
would be a MyISAM table with no indexes. It may have only two columns -
an id and the unparsed data - or, if the messages always have the same
structure, it could have several columns.

The idea is to have a table that the application can insert data into as
quickly as possible.
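As a sketch, such a scratch table might look like this (the table and column names are illustrative, not from the original post):

```sql
-- MyISAM with nothing beyond a single auto-increment key,
-- so INSERTs stay as cheap as possible.
CREATE TABLE message_spool (
    id  INT NOT NULL AUTO_INCREMENT,
    raw TEXT NOT NULL,
    PRIMARY KEY (id)
) ENGINE=MyISAM;
```

MyISAM requires an index on an AUTO_INCREMENT column, so one key is unavoidable here; the point is to keep everything else unindexed.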

Then I have a batch script that fetches the unparsed info from this
table and inserts it into the appropriate tables. Depending on how busy
the system is, how much data is being spooled, how long it takes to
parse the data, etc., I might fetch one row at a time - or I might fetch
a dozen, read them all into an array, then process them.

The only thing you need to watch out for is that you don't exceed the
processing time limit for the PHP script.
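A rough sketch of that batch loop, assuming a scratch table named message_spool with id and raw columns and a hypothetical parse_message() helper (all names are mine, for illustration):

```php
<?php
// Drain a batch of rows from the scratch table and move them
// into the real tables. Meant to be run from cron / the CLI.
set_time_limit(0);  // lift PHP's execution time limit for a CLI job

$db = mysqli_connect('localhost', 'user', 'pass', 'messages_db');
if (!$db) {
    die('Connect failed: ' . mysqli_connect_error());
}

// Fetch a dozen unparsed rows into an array first...
$res  = mysqli_query($db, 'SELECT id, raw FROM message_spool LIMIT 12');
$rows = array();
while ($row = mysqli_fetch_assoc($res)) {
    $rows[] = $row;
}
mysqli_free_result($res);

// ...then parse each one, insert it where it belongs, and
// remove it from the spool.
foreach ($rows as $row) {
    $fields = parse_message($row['raw']);   // hypothetical parser
    // INSERT $fields into the appropriate tables here.
    mysqli_query($db, 'DELETE FROM message_spool WHERE id = '
                      . (int) $row['id']);
}
mysqli_close($db);
?>
```

If the script is invoked from the web rather than the command line, set_time_limit() (or max_execution_time in php.ini) is exactly the limit the paragraph above warns about.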

--
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp.
jstucklex@attglobal.net
==================


