Posted by Hugo Kornelis on 11/03/05 00:54
On Wed, 2 Nov 2005 22:47:00 +0000 (UTC), Erland Sommarskog wrote:
>Hugo Kornelis (hugo@pe_NO_rFact.in_SPAM_fo) writes:
>> The nice thing about the "old" method was that the implicit conversion
>> of the constant enabled the optimizer to use an index that was defined
>> on the column; in SQL Server 2000, the implicit conversion of the column
>> would preclude the use of that index.
>
>But what happened if two columns of different data types met?
Hi Erland,
I'm sorry, but I don't know that.
I learned what I posted while figuring out why queries that ran
smoothly on SQL 7 were going at a snail's pace after upgrading. It turned
out that converting the column instead of converting the constant meant
that a table scan was chosen instead of an index seek.
I've never witnessed similar problems for column to column comparisons,
so I don't know how they were executed.
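To illustrate the kind of thing I ran into, here's a hedged sketch (table
and column names are made up, not from the actual upgrade): comparing an
indexed varchar column to an nvarchar constant. Since nvarchar is higher
in the data type precedence list, SQL 2000 converts the *column*, and the
index on it may no longer be usable for a seek.

```sql
-- Hypothetical repro table: OrderRef is varchar and indexed.
CREATE TABLE dbo.Orders (
    OrderID  int IDENTITY PRIMARY KEY,
    OrderRef varchar(20) NOT NULL
);
CREATE INDEX IX_Orders_OrderRef ON dbo.Orders (OrderRef);

-- N'...' makes the constant nvarchar. nvarchar outranks varchar in the
-- precedence rules, so the column side gets the implicit conversion --
-- on SQL 2000 you can end up with a scan instead of an index seek.
SELECT OrderID FROM dbo.Orders WHERE OrderRef = N'A123';

-- Match the constant's type to the column and the seek is back.
SELECT OrderID FROM dbo.Orders WHERE OrderRef = 'A123';
```

Whether you actually get a scan depends on the exact version and
collation, so check the plan yourself; but that's the shape of the
problem.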
>> Of course, the price one paid for the index use in the old version was
>> that the database didn't always do what you'd expect after perusing the
>> precedence rules.
>
>As I said, I basically slept over SQL 7, so missed the problem.
>
>Of course, in many cases, these problems could be avoided by not
>having implicit conversions at all.
I couldn't agree more!
Best, Hugo
--
(Remove _NO_ and _SPAM_ to get my e-mail address)