Posted by Alex Kuznetsov on 12/08/06 03:23
meeraguna@gmail.com wrote:
> We have 20-30 normalized tables in our database. We also have 4
> tables where we store the calculated data from those normalized
> tables. The reason we have these 4 denormalized tables is that when
> we try to do the calculation on the fly, our site becomes very slow,
> so we have precalculated the data and stored it in the 4 tables.
>
> The process we use for the precalculation does the calculation and
> stores the results in a temp table. It then compares the temp table
> with the denormalized tables, inserts new rows, deletes old ones,
> and updates any changed ones. This process takes about 20-60
> minutes. It takes so long because we first do the full calculation
> regardless of what changed, and only then compare to see which rows
> were changed, deleted, or added.
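For reference, the compare-and-sync step described above usually
boils down to three statements. A minimal sketch, assuming a
hypothetical denormalized table dbo.Summary and temp table #Calc,
both keyed on KeyCol with one calculated column Total (declared NOT
NULL):

-- delete rows that disappeared from the new calculation
DELETE s
FROM dbo.Summary AS s
WHERE NOT EXISTS
    (SELECT * FROM #Calc AS c WHERE c.KeyCol = s.KeyCol);

-- insert rows that are new in this calculation
INSERT INTO dbo.Summary (KeyCol, Total)
SELECT c.KeyCol, c.Total
FROM #Calc AS c
WHERE NOT EXISTS
    (SELECT * FROM dbo.Summary AS s WHERE s.KeyCol = c.KeyCol);

-- update rows whose calculated value changed
-- (with nullable columns you would compare via ISNULL instead)
UPDATE s
SET s.Total = c.Total
FROM dbo.Summary AS s
JOIN #Calc AS c ON c.KeyCol = s.KeyCol
WHERE s.Total <> c.Total;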
>
>
> Now we would like to capture the rows/columns changed in the
> normalized tables and apply only those changes to the denormalized
> tables, which we hope will reduce the processing time by at least 50%.
>
>
> We have upgraded to SQL Server 2005, so we would like to use the new
> technology for this process. I have to design a model that captures
> the changes and updates only those changes.
>
> I have the list of normalized tables and the columns which affect
> the end results.
>
>
> I thought of using triggers or the OUTPUT clause to capture the
> changes. Please help me with any ideas on how to design the new
> process.
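If you do go the trigger route, the usual pattern is to log just the
changed keys into a small queue table and have the batch job
recalculate only those. A rough sketch with hypothetical names
(dbo.Orders as one of the normalized source tables, dbo.ChangedKeys
as the queue):

CREATE TABLE dbo.ChangedKeys (
    KeyCol    int      NOT NULL,
    ChangedAt datetime NOT NULL DEFAULT GETDATE()
);
GO

CREATE TRIGGER trg_Orders_CaptureChanges
ON dbo.Orders
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    -- inserted and deleted hold the affected rows; UNION covers
    -- inserts, deletes, and both sides of an update
    INSERT INTO dbo.ChangedKeys (KeyCol)
    SELECT KeyCol FROM inserted
    UNION
    SELECT KeyCol FROM deleted;
END;
GO

The OUTPUT clause gives you the same information without a trigger,
e.g. UPDATE dbo.Orders SET ... OUTPUT inserted.KeyCol INTO
dbo.ChangedKeys (KeyCol) ..., but only if you can modify every
statement that touches the base tables.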
That said, we do not do much of this denormalization stuff manually
these days - covering indexes and indexed views usually boost
performance well enough, and the engine maintains them automatically...
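For example, here is a minimal indexed view over a hypothetical
dbo.Orders table; once the unique clustered index is created, SQL
Server keeps the aggregates up to date as the base table changes:

CREATE VIEW dbo.OrderTotals
WITH SCHEMABINDING
AS
SELECT CustomerId,
       COUNT_BIG(*) AS OrderCount, -- COUNT_BIG(*) is required in a grouped indexed view
       SUM(Amount)  AS TotalAmount -- assumes Amount is declared NOT NULL
FROM dbo.Orders
GROUP BY CustomerId;
GO

-- this index materializes the view; it is maintained automatically
CREATE UNIQUE CLUSTERED INDEX IX_OrderTotals
ON dbo.OrderTotals (CustomerId);
GO

One caveat: on Standard Edition you have to query the view WITH
(NOEXPAND) for the index to be used; Enterprise Edition considers it
automatically.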