Posted by Erland Sommarskog on 10/01/57 11:43
(JayCallas@hotmail.com) writes:
> So when I get a new security master file, the set-based approach says
> to insert all the new common security data into the SecurityMaster
table and then go back and reprocess the new data to insert the
> option specific data. My dilemma is that I feel it is more "natural" to
> treat each row as a single entity.
That is a feeling that you should overcome. It's perfectly normal to
read the same row all over again. (Maybe one day we will get a multi-table
INSERT, but I am not holding my breath.)
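To make that concrete, here is a sketch of the set-based two-pass load.
(Everything apart from SecurityMaster is a made-up name; adapt it to
your actual schema.)

   -- Pass 1: common data for all new securities.
   INSERT INTO SecurityMaster (SecurityId, Symbol, SecurityType)
   SELECT s.SecurityId, s.Symbol, s.SecurityType
   FROM   Staging s
   WHERE  NOT EXISTS (SELECT *
                      FROM   SecurityMaster sm
                      WHERE  sm.SecurityId = s.SecurityId)

   -- Pass 2: read the same staging rows again for the
   -- option-specific data.
   INSERT INTO OptionDetail (SecurityId, StrikePrice, ExpiryDate)
   SELECT s.SecurityId, s.StrikePrice, s.ExpiryDate
   FROM   Staging s
   WHERE  s.SecurityType = 'OPT'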
> * As far as exception handling goes, why is it considered "normal" to
> want all the data or just some of it? What about the idea of getting
> "most" of the data in and then dealing with those "exceptions"? Is it
> considered "business logic" to say "I rather have the majority of the
> data as opposed to NONE of the data"?
Of course, sometimes that is the business rule: all or nothing. But
there are also lots of processes where it's OK that some data slips
through the cracks, as long as what is a unit stays a unit. (You don't
want an option to be imported into SecurityMaster, but then fail to get
the option-specific data in place. In that case you would rather lose
the row entirely.)
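In T-SQL the usual way to keep the unit a unit is to wrap both inserts
in one transaction, so a failure in either rolls back both. A sketch,
using the same made-up names as above (TRY/CATCH requires SQL 2005):

   BEGIN TRY
      BEGIN TRANSACTION

      INSERT INTO SecurityMaster (SecurityId, Symbol, SecurityType)
      SELECT SecurityId, Symbol, SecurityType
      FROM   Staging

      INSERT INTO OptionDetail (SecurityId, StrikePrice, ExpiryDate)
      SELECT SecurityId, StrikePrice, ExpiryDate
      FROM   Staging
      WHERE  SecurityType = 'OPT'

      COMMIT TRANSACTION
   END TRY
   BEGIN CATCH
      -- Either insert failed: undo both, so we never have an option
      -- in SecurityMaster without its option-specific row.
      IF @@TRANCOUNT > 0
         ROLLBACK TRANSACTION
      -- Log or re-raise the error here as appropriate.
   END CATCH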
In the case where you import data, validation errors may be anticipated,
and you can try to check for the most likely ones in advance. In the end,
what matters is of course whether the performance of the solution you
have is acceptable.
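One way to check for the most likely errors up front is to divert the
offending rows to an error table and load the rest. Again just a sketch
with made-up names:

   -- Capture rows that would fail validation, with a reason.
   INSERT INTO ImportErrors (SecurityId, Reason)
   SELECT s.SecurityId, 'Already in SecurityMaster'
   FROM   Staging s
   WHERE  EXISTS (SELECT *
                  FROM   SecurityMaster sm
                  WHERE  sm.SecurityId = s.SecurityId)
   UNION ALL
   SELECT s.SecurityId, 'Option without a strike price'
   FROM   Staging s
   WHERE  s.SecurityType = 'OPT' AND s.StrikePrice IS NULL

   -- Remove them from the staging table before the real load.
   DELETE FROM Staging
   WHERE  SecurityId IN (SELECT SecurityId FROM ImportErrors)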
--
Erland Sommarskog, SQL Server MVP, esquel@sommarskog.se
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/prodtechnol/sql/2005/downloads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodinfo/previousversions/books.mspx