Posted by Bob Barrows [MVP] on 01/18/07 12:20
wizofaus@hotmail.com wrote:
> Not sure, but now I have another issue which does seem to point
> towards parameters being a problem.
>
> If I run a query with quite a large number of parameters which is
> basically
>
> SELECT Count(*) FROM MyTable WHERE Key1 IN (@0, @1, @2...@12) AND Key2
> IN (@13, @14, @15...@20)
>
> it takes over 2 seconds, but when I substitute the last 8 parameters
> (13-20) with their values, i.e.
>
> SELECT Count(*) FROM MyTable WHERE Key1 IN (@0, @1, @2...@12) AND Key2
> IN ('value1', 'value2', 'value3', 'value4'...'value8')
>
> the query takes 15 milliseconds! (actually it took slightly longer -
> about 400msec - on the first run - the previous query always takes ~2
> seconds no matter how often I run it).
Now you've got me really intrigued. Based on everything I've ever read, IN
comparisons are supposed to be non-sargable, and therefore non-optimizable.
Someone from the SQL Server groups please correct me if I am wrong. Oh wait,
maybe you have a sufficient number of values to cause the query engine to
use a temp table and join, in which case optimization can occur.
I suggest you use SQL Profiler to determine the difference in the execution
plans for each method. If the fast query plan involves an index that is not
used in the slow query plan, you can use an index hint to force the query
engine to use that index
(http://www.sql-server-performance.com/transact_sql_where.asp).
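For example, a rough sketch only (the index name here is a placeholder; you
would substitute whatever index the fast plan actually uses):

SELECT Count(*)
FROM MyTable WITH (INDEX(IX_MyTable_Key2))  -- hypothetical index name
WHERE Key1 IN (@0, @1, @2 /* ...@12 */)
  AND Key2 IN (@13, @14, @15 /* ...@20 */)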
>
> Curiously, the fewer parameters I substitute for the Key2 filter, the
> slower it runs (I can't improve on the 15 msec by doing the same for
> the Key1 filter), in a more or less linear relationship.
>
> I've tried all the previously mentioned "tricks", including reindexing
> the table, but no luck.
What about the trick that involves using local variables in your batch,
instead of directly using the parameters:
declare @tmp1 varchar(50), @tmp2 varchar(50) -- etc., matching your parameter types
set @tmp1 = @parm1
set @tmp2 = @parm2
-- etc. for the rest
SELECT Count(*) FROM MyTable WHERE Key1 IN (@tmp1, @tmp2 /* , ... */) AND ...
Personally, I would do this in a stored procedure, but you have ignored
previous advice to use stored procedures.
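Just to illustrate, a minimal sketch of the stored procedure version (the
procedure name, parameter names and types are my own placeholders, not your
actual schema):

CREATE PROCEDURE dbo.CountMyTable
    @p0 varchar(50), @p1 varchar(50),   -- ... one per Key1 value
    @p13 varchar(50), @p14 varchar(50)  -- ... one per Key2 value
AS
BEGIN
    SET NOCOUNT ON
    -- copy the parameters into local variables, same idea as the
    -- local-variable trick above
    DECLARE @k0 varchar(50), @k1 varchar(50) -- etc.
    SET @k0 = @p0
    SET @k1 = @p1
    -- etc.
    SELECT Count(*)
    FROM MyTable
    WHERE Key1 IN (@k0, @k1 /* , ... */)
      AND Key2 IN (@p13, @p14 /* , ... */)
END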
>
> So it seems maybe SQL server has problems optimizing if there are too
> many parameters.
> FWIW, I did try putting the parameter values in a temporary table and
> even using a join, but the total time is significantly longer than
> 15ms.
ISTR reading somewhere that with a sufficient number of values, the query
engine does this anyway behind the scenes.
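For comparison, the kind of thing I mean is roughly this (the temp table and
column names are made up):

CREATE TABLE #Key2Values (KeyValue varchar(50) PRIMARY KEY)
INSERT INTO #Key2Values (KeyValue) VALUES (@13)
INSERT INTO #Key2Values (KeyValue) VALUES (@14)
-- etc. for the remaining Key2 parameters
SELECT Count(*)
FROM MyTable t
INNER JOIN #Key2Values v ON t.Key2 = v.KeyValue
WHERE t.Key1 IN (@0, @1, @2 /* ...@12 */)
DROP TABLE #Key2Values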
>
> Again, the parameter values are under my control, so there's no risk
> of SQL injection,
I always cringe when I hear somebody say this. Unless those values are
hard-coded into your application code, you have to be getting those values
from somewhere. If a user was involved at any point in the process that
generates those values, then you need to at least entertain the possibility
that some funny business may have occurred. Look in these articles for
"secondary sql injection". It is always a mistake to assume that your user
base is too ignorant to take advantage of these techniques:
http://mvp.unixwiz.net/techtips/sql-injection.html
http://www.sqlsecurity.com/DesktopDefault.aspx?tabid=23
http://www.nextgenss.com/papers/advanced_sql_injection.pdf
http://www.nextgenss.com/papers/more_advanced_sql_injection.pdf
The bottom line may turn out to be that you need to choose between secure
and fast.
--
Microsoft MVP - ASP/ASP.NET
Please reply to the newsgroup. This email account is my spam trap so I
don't check it very often. If you must reply off-line, then remove the
"NO SPAM"