Posted by Ben on 03/08/06 22:52
We are planning to add a new attribute to one of our tables to speed up
data access. Once the attribute is added, we will need to populate
that attribute for each of the records in the table.
Since the table in question is very large, the update statement takes a
considerable amount of time. From reading old posts and Books Online, it
looks like one of the big things slowing the update down is writing to
the transaction log.
I have found mention of "truncate log on checkpoint", of using "SET
ROWCOUNT" to limit the number of rows updated at once, and of "DUMP
TRANSACTION databaseName WITH NO_LOG".
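For reference, the batched approach I'm considering looks roughly like
this (BigTable, NewCol, and SomeOtherCol are placeholder names, not our
actual schema, and the log step assumes SQL Server 2000-era syntax):

```sql
-- Limit each UPDATE to 10,000 rows per pass
SET ROWCOUNT 10000

WHILE 1 = 1
BEGIN
    -- Populate only rows not yet filled in, so each pass
    -- picks up where the last one left off
    UPDATE BigTable
    SET NewCol = SomeOtherCol
    WHERE NewCol IS NULL

    -- Stop once no rows remain to update
    IF @@ROWCOUNT = 0 BREAK

    -- Optionally truncate the log between batches
    -- (deprecated syntax; DUMP TRANSACTION is the older form)
    BACKUP LOG myDatabase WITH TRUNCATE_ONLY
END

-- Restore the default (unlimited) rowcount
SET ROWCOUNT 0
```

This keeps each transaction small so the log never grows much, at the
cost of running many passes over the table.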
Does anyone have any opinions on these tactics? Please let me know if
you want more information about the situation in order to provide an
answer!