Posted by tuco357 on 06/13/06 21:30
This problem has been vexing me for some time and I thought I should
consult the group...
Oftentimes, when writing a PHP script to handle some MySQL DB
transactions, I must write code that performs, say, an insert into one
table, retrieves the new row's auto-increment ID, and then makes a
second insertion into another table that requires the previously
obtained ID. Both queries must be executed and completed - if query
one succeeds and query two fails, I roll back the changes of query one.
Obviously, there will be times when many interdependent queries are
involved - rolling back in such cases is a headache, albeit a necessary
one. We all know that a robust web system must do a lot of error
checking, and the integrity of the databases must be protected at all
times.
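
For concreteness, here is a minimal sketch of the pattern I mean. It
assumes the mysql_* extension and a hand-rolled rollback (a
compensating DELETE) rather than real transactions; the tables MEMBERS
and PROFILES and their columns (id, name, member_id, bio) are made up
for illustration.

<?php
$link = mysql_connect('localhost', 'user', 'password');
mysql_select_db('mydb', $link);

// Query one: create the member row.
if (mysql_query("INSERT INTO MEMBERS (name) VALUES ('alice')", $link)) {
    // The auto-increment ID produced by query one.
    $id = mysql_insert_id($link);

    // Query two: the dependent insert.
    $ok = mysql_query("INSERT INTO PROFILES (member_id, bio)
                       VALUES ($id, 'hello')", $link);
    if (!$ok) {
        // Query two failed: undo query one by hand.
        mysql_query("DELETE FROM MEMBERS WHERE id = $id", $link);
    }
}
?>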
Now, for the big problem that I cannot find an *elegant* solution
to...
Suppose that in the original example, query one is executed and
completed. However, just before query two is issued by the PHP script,
the whole darn server goes down. Thus, query one is complete, query
two never took place, and when the server is restarted, the database is
left in an inconsistent state!
An obvious but, IMHO, clunky solution is to use a set of scripts, run
every few hours or days, that go through the database and verify that
everything makes sense - e.g., that there is no row in PROFILES whose
member ID cannot be found in the MEMBERS table (a user has a profile
but no basic account info in MEMBERS). As problems are discovered,
they can be corrected automatically, or an alert can be sent to an
admin - something like the sketch below.
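
As an illustration, the orphan check could boil down to a single LEFT
JOIN query; the table and column names are again hypothetical.

<?php
$link = mysql_connect('localhost', 'user', 'password');
mysql_select_db('mydb', $link);

// Find PROFILES rows whose member_id has no matching MEMBERS row.
$sql = "SELECT p.member_id
        FROM PROFILES p
        LEFT JOIN MEMBERS m ON m.id = p.member_id
        WHERE m.id IS NULL";
$res = mysql_query($sql, $link);
while ($row = mysql_fetch_assoc($res)) {
    // Orphaned profile: fix it automatically or flag it for an admin.
    error_log('Orphaned profile for member_id ' . $row['member_id']);
}
?>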
Obviously, my solution is the pits. What would you do?