Posted by Brian - Support on 12/04/07 19:56
A handful of tables with 10-20 million records.
For one thing, we're having to run UPDATE STATISTICS quite frequently
or performance slows down, and the update takes 3-4 minutes per table
(at 2% sampling) and uses quite a bit of CPU while it runs.
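For reference, the maintenance is essentially the following (the table
name here is just a placeholder, not our actual schema):

    -- Resample statistics for one of the large tables at 2 percent.
    -- "dbo.ArchiveOrders" is a made-up example name.
    UPDATE STATISTICS dbo.ArchiveOrders
    WITH SAMPLE 2 PERCENT;

We run something like that per table, which is where the 3-4 minutes
each comes from.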
Also, we would put the larger "archive" tables in a separate filegroup
so that we can do frequent backups of the smaller tables for quicker
emergency restores.
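Roughly, the plan would look like this (database, file, and path names
below are only examples for illustration):

    -- Add a filegroup for the big archive tables.
    ALTER DATABASE OurDb ADD FILEGROUP Archive;
    ALTER DATABASE OurDb ADD FILE
        (NAME = OurDb_Archive,
         FILENAME = 'E:\Data\OurDb_Archive.ndf')
        TO FILEGROUP Archive;

    -- The frequent backups would then cover only the primary
    -- filegroup, where the smaller active tables live.
    BACKUP DATABASE OurDb
        FILEGROUP = 'PRIMARY'
        TO DISK = 'E:\Backups\OurDb_Primary.bak';

The archive tables themselves would get rebuilt onto the new filegroup
(e.g. by recreating their clustered indexes there) and backed up on a
slower schedule.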