Reply to Automatically Archiving a Large Table

Posted by Brian - Support on 12/03/07 23:43

I have an extremely large table that is starting to slow down
production. I want to keep it for historical queries, but I also want
it to be fast for daily production activity. The daily activity never
accesses records older than a few days. One of the difficult things
about this table is that there are a lot of foreign keys that
reference it.

So, I'm thinking of doing this:

1) Keep a "live" archive of the table using INSERT, UPDATE, and DELETE
triggers.
2) Have a job that disables the DELETE trigger and deletes records
older than one week.
3) As part of this, also delete the records that reference this table.
4) Do the same thing with the other large tables, including, of
course, the ones that reference this table.

The trick, it seems, is mostly to delete the records from the various
tables in the right order, so that no foreign keys are left dangling.

Is this a reasonable approach? Is there a more straightforward or
built-in technique for doing this?
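The ordering concern above (children before parents) might look like this sketch, again in SQLite for illustration; the `orders`/`order_items` schema and the cutoff date are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when asked
cur = conn.cursor()

cur.executescript("""
CREATE TABLE orders (id INTEGER PRIMARY KEY, created TEXT);
CREATE TABLE order_items (
    id INTEGER PRIMARY KEY,
    order_id INTEGER REFERENCES orders(id),
    qty INTEGER
);
""")
cur.execute("INSERT INTO orders VALUES (1, '2007-11-01')")  # old
cur.execute("INSERT INTO orders VALUES (2, '2007-12-03')")  # recent
cur.execute("INSERT INTO order_items VALUES (10, 1, 3)")    # references the old order

cutoff = '2007-11-26'
# Delete the referencing rows first, then the referenced rows, so no
# foreign key is ever left dangling mid-purge.
cur.execute("""DELETE FROM order_items WHERE order_id IN
               (SELECT id FROM orders WHERE created < ?)""", (cutoff,))
cur.execute("DELETE FROM orders WHERE created < ?", (cutoff,))
conn.commit()

remaining = [r[0] for r in cur.execute("SELECT id FROM orders")]
print(remaining)  # only the recent order survives
```

Running the whole purge inside one transaction, as here, also means readers never observe a half-deleted dependency chain.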

We examined partitioning, but it does not seem to be a practical
approach because, among other things, some of the tables are
self-referencing and those references would cross partition boundaries.
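A self-referencing table adds a wrinkle even to the trigger-and-purge plan: old rows can't be deleted in one pass if other rows still point at them. One possible sketch (hypothetical `tasks` table with a `parent_id` self-reference, SQLite for illustration) deletes leaf rows repeatedly until nothing old remains:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE tasks (
    id INTEGER PRIMARY KEY,
    parent_id INTEGER REFERENCES tasks(id),
    created TEXT
);
""")
cur.execute("INSERT INTO tasks VALUES (1, NULL, '2007-10-01')")  # old parent
cur.execute("INSERT INTO tasks VALUES (2, 1,    '2007-10-02')")  # old child

# A single DELETE of all old rows could violate the self-reference;
# instead, repeatedly delete old rows that nothing still points at.
while True:
    cur.execute("""DELETE FROM tasks
                   WHERE created < '2007-11-26'
                     AND id NOT IN (SELECT parent_id FROM tasks
                                    WHERE parent_id IS NOT NULL)""")
    if cur.rowcount == 0:
        break
conn.commit()

count = cur.execute("SELECT COUNT(*) FROM tasks").fetchone()[0]
print(count)  # → 0
```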

Thanks!

- Brian


