 Posted by Bill on 06/11/96 11:32 
Sorry, re-posted as my email settings were wrong on the last post.
 
 
 
I wonder if anyone can help. 
 
I have a scheduled job running overnight to delete old records from a
particular DB table.
 
The table contains more than half a million records, and the script
simply uses the date field to delete any records whose date is older
than 7 days. My guess is that there will be some 100,000 records which
need to be deleted.
 
The job takes ages to run, and whilst it is running SQL Server is
completely locked and the associated website is effectively offline
because of it.
 
I think I know where the problem lies but I don't know why. 
 
Whoever created the DB table made a varchar 255 field the primary key.
The field is then filled with a 40-character string generated by the
webserver script so that each row gets a 'unique' value.
 
Those are the facts. The following is my interpretation. 
 
I think the server is struggling to cope not only with a character
field as the primary key, but with one holding so many characters. In
such a large table, presumably it must do some sort of ordering
operation on the primary key in order to delete the old records, and it
is this that is causing the problem.
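
In case it helps anyone check my reasoning, I believe the existing
indexes on the table can be listed with SQL Server's sp_helpindex
procedure, something like this (using the myTable name from the script
below):

exec sp_helpindex 'myTable'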
 
I would like to introduce a new field called 'id', make it autonumbered
and primary, and make the errant field non-primary.
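
Something along these lines is what I had in mind (just a sketch; the
constraint name PK_myTable is a guess, since I don't know what the
existing primary key constraint is really called):

exec sp_helpconstraint 'myTable'    -- find the real name of the existing PK constraint
go
alter table myTable drop constraint PK_myTable    -- hypothetical constraint name
go
alter table myTable add id int identity(1,1) not null
go
alter table myTable add constraint PK_myTable_id primary key (id)
go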
 
So my question is this:  
 
Is my analysis correct and, more importantly, why? Can anyone give me
clear reasoning for it?
 
Also, is the solution sound?
 
 
Table looks like this: 
 
clientID      int
refID         varchar(255)   Primary Key
fieldA        varchar(512)
creationDate  datetime
fieldB        varchar(255)
fieldC        varchar(32)
 
Job script: 
 
delete from myTable where creationDate < [7daysAgo] 
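
([7daysAgo] is just a placeholder; if the cutoff were worked out in
T-SQL rather than passed in by the webserver script, I assume it would
be something like this:)

delete from myTable where creationDate < dateadd(day, -7, getdate())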
 
 
 
Thanks in anticipation 
 
Bill
 
  