Posted by Tony Rogerson on 08/05/07 08:32
> ...about portable code, how people should code, etc.
 Why sacrifice maintainability, support costs, development costs and simplicity
 of design for something nobody wants? Nobody is interested in portability.
 
 > the repeated code is easy to maintain with a text edit, but let's talk
 > about portable code, how people should code, etc.
 
 No, it isn't; have you tried? Cut and paste is error prone, and how long will
 it take you to type all those parameters and test every possible parameter
 combination to make sure you got it right? Answer: a long time.
 
 > 1) most people will not type in more than 10 to 25 parameters and they
 > *seldom* need to.
 
 Are you sure about that? Why are you suggesting people can use 1,000
 parameters, then? You have made a general statement of design advice.
 
 >When an input list gets long, you need to load a
 > table and scrub the data before you invoke the procedure.
 
 Oh yes, and that is the other scalability and concurrency weakness in the
 design you propose: it doesn't scale.
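 
 Just to spell that out, here is my reading of the "load a table and scrub the
 data" design; the table, column and proc names below are invented for
 illustration only. Every caller has to load the same shared staging table
 before it can invoke the proc, so concurrent sessions queue up behind each
 other's inserts, scrubs and deletes.
 
 --  a sketch of my reading of the "load and scrub" design; dbo.param_staging,
 --  dbo.save_region and the column names are made up
 create table dbo.param_staging (
     spid        int not null,   -- which session loaded the row
     param_value int null
 )
 
 --  every concurrent caller piles into the same table...
 insert dbo.param_staging ( spid, param_value ) values ( @@spid, 11 )
 insert dbo.param_staging ( spid, param_value ) values ( @@spid, 12 )
 
 --  ...then calls the proc to scrub and use "its" rows, and cleans up afterwards
 exec dbo.save_region
 delete dbo.param_staging where spid = @@spid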
 
 > 2) what code do "CSV spliter" people write to hande strings like
 > '1,,,4',  or 'NULL, NULL, 2' or '1, ''2.34'', 7' or whatever?  there
 > is no parsing or error handling and therefore no data integrity.
 
 Who mentioned a CSV splitter? There is no need for one with dynamic SQL.
 
 If you wanted to, you could even use dynamic SQL to take the CSV parameter
 input and populate a table, thus....
 
 That gives you the full referential integrity checking for the values
 passed.
 
 Nothing complicated about the stuff below; no procedural loops etc...
 
 create proc csv_eg
     @csv    varchar(500)
 as
 begin
     --  tidy the string and make sure the commas are correct
     set @csv = ltrim(rtrim(@csv))
 
     if left( @csv, 1 ) <> ','
         set @csv = ',' + @csv
     if right( @csv, 1 ) = ','
         set @csv = @csv + 'NULL'
 
     --  turn empty elements (,,) into explicit NULLs; the second pass
     --  catches the ,, pairs the first pass leaves behind
     set @csv = replace( @csv, ',,', ',NULL,' )
     set @csv = replace( @csv, ',,', ',NULL,' )
 
     --  double up single quotes to kill any injection attempt
     set @csv = replace( @csv, '''', '''''' )
 
     declare @sql varchar(max)
 
     --  turn each comma into the start of an INSERT statement, close each
     --  VALUES list, strip the leftover ' )' + CRLF from the front and
     --  close the final value
     set @sql = replace( @csv, ',', char(13) + char(10)
                       + 'insert #csv_split ( csv_value ) values( ' )
     set @sql = replace( @sql, char(13), ' )' + char(13) )
     set @sql = right( @sql, len( @sql ) - 4 ) + ' )'
 
     create table #csv_split (
         csv_value   int null check( csv_value between 10 and 20 )
     )
 
     --  the generated INSERTs write straight into the temp table, which is
     --  visible inside the dynamic batch
     exec( @sql )
 
     select *
     from #csv_split
 end
 
 --  this works fine
 exec csv_eg ',11,12,13,14,,,,,,,,,,,,,,,'
 
 --  these fail correctly because the data passed is not valid
 exec csv_eg ',11,12,13,14,1,,,,,,,,,,,,,,'
 exec csv_eg ',11,12,13,14,''ABC'',,,,,,,,,,,,,,'
 
 
 How would you validate each one of those parameters? You would either need a
 lot of IF ELSE statements, which would be procedural and we all know your view
 on using IF ELSE, or you could populate a table in the proc like I've just
 done; only you'd have to do all the work manually: code all the inserts, code
 all the parameters, test all the combinations. It's dinosaur programming at
 best, something we'd have done back in the 80s because languages weren't as
 powerful as they are today (2007).
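 
 Just so we're comparing like with like, here is a rough sketch of what that
 manual version looks like; the proc name, the parameter names and the 10-to-20
 rule are made up to mirror the csv_eg example above, and I've only shown three
 of the 1,000 parameters.
 
 --  a sketch only: names and the check rule are invented for illustration
 create proc param_eg
     @P0001  int = null,
     @P0002  int = null,
     @P0003  int = null
     --  ... and so on, declared by hand, all the way up to @P1000 ...
 as
 begin
     create table #param_check (
         param_value int null check( param_value between 10 and 20 )
     )
 
     --  one hand-written insert per parameter, every one of them to be
     --  typed, reviewed and regression tested whenever the list changes
     insert #param_check ( param_value ) values( @P0001 )
     insert #param_check ( param_value ) values( @P0002 )
     insert #param_check ( param_value ) values( @P0003 )
     --  ... another 997 of these ...
 end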
 
 > unh? chunking will put related parameters on one line in a logical
 > contigous ordering.
 
 Have you done any real programming?
 
 You'd have 1,000 of these Parameters.Add lines too....
 
 cmdSQL = New SqlCommand("ihs.admin_meta_region_save", oDBConn)
 cmdSQL.CommandType = CommandType.StoredProcedure
 
 cmdSQL.Parameters.Add(New SqlParameter("@P0001", tbRegionNew.Text))
 cmdSQL.Parameters.Add(New SqlParameter("@P0002", tbRegionNew.Text))
 cmdSQL.Parameters.Add(New SqlParameter("@P0003", tbRegionNew.Text))
 '  ...one of these lines for every single parameter...
 cmdSQL.Parameters.Add(New SqlParameter("@P1000", tbRegionNew.Text))
 
 cmdSQL.ExecuteNonQuery()
 
 
 > if the code does not have to do validatiion and produce correct data,
 > who cares about a fast wrong answer?  I was one of the first people to
 > do a CSV scan in a single SQL query.  I wrote a kludge.  I repent of
 > my evil.
 
 As I showed earlier in this post with the csv_eg example, you can do data
 validation and get full RI checking on the values passed.
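 
 For example, once the values are sitting in #csv_split you can check them
 against the real table; dbo.region and region_id below are made-up names, so
 treat this as a sketch only.
 
 --  a sketch only: dbo.region / region_id are invented reference-table names
 if exists ( select *
             from   #csv_split c
             where  not exists ( select *
                                 from   dbo.region r
                                 where  r.region_id = c.csv_value ) )
     raiserror( 'one or more of the values passed is not a valid region', 16, 1 )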
 
 Show me your example and I'll rip it to bits; no doubt it involves self-joins,
 no data validation and string manipulation with SUBSTRING...
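 
 For the record, this is the sort of thing I mean: a rough sketch of the
 single-query substring split, with master..spt_values standing in as the
 numbers table and not a scrap of validation in sight.
 
 --  a sketch of the substring/numbers-table split I'm criticising:
 --  no parsing rules, no error handling, just string slicing
 declare @csv varchar(500)
 set @csv = ',' + '11,12,13,14' + ','
 
 select  substring( @csv, v.number + 1,
                     charindex( ',', @csv, v.number + 1 ) - v.number - 1 ) as csv_value
 from    master..spt_values v
 where   v.type = 'P'
 and     v.number between 1 and len( @csv ) - 1
 and     substring( @csv, v.number, 1 ) = ','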
 
 --
 Tony Rogerson, SQL Server MVP
 http://sqlblogcasts.com/blogs/tonyrogerson
 [Ramblings from the field from a SQL consultant]
 http://sqlserverfaq.com
 [UK SQL User Community]
 
 
 "--CELKO--" <jcelko212@earthlink.net> wrote in message
 news:1186283386.198635.327190@w3g2000hsg.googlegroups.com...
 >>>  Comparing the CSV approach with the parameter approach below, which one
 >>> do you consider more maintainable and supportable? <<
 >
 > the repeated code is easy to maintain with a text edit, but let's talk
 > about portable code, how people should code, etc.
 >
 > 1) most people will not type in more than 10 to 25 parameters and they
 > *seldom* need to.  When an input list gets long, you need to load a
 > table and scrub the data before you invoke the procedure.
 >
 > 2) what code do "CSV spliter" people write to hande strings like
 > '1,,,4',  or 'NULL, NULL, 2' or '1, ''2.34'', 7' or whatever?  there
 > is no parsing or error handling and therefore no data integrity.
 >
 >>> Also consider the application will need one line per parameter used on
 >>> the stored procedure. <<
 >
 > unh? chunking will put related parameters on one line in a logical
 > contigous ordering.
 >
 >>> This which takes milliseconds to edit in Management Studio....<<
 >
 > if the code does not have to do validatiion and produce correct data,
 > who cares about a fast wrong answer?  I was one of the first people to
 > do a CSV scan in a single SQL query.  I wrote a kludge.  I repent of
 > my evil.
 >
 >
 >