Posted by Hugo Kornelis on 01/30/08 21:52
On Wed, 30 Jan 2008 13:18:59 -0800 (PST), lee.richmond wrote:
>Hi,
>
>I have a SQL query like this:
>
>select avg([mycolumn]) from data
>where date > '1/5/08' and date < '1/10/08'
>group by [mycolumn]
>order by [mycolumn] desc
>
>If all the values being averaged are non-zero, I'm fine. But if one of
>them is a 0 (not a NULL, an actual 0), it doesn't get averaged in. For
>instance, the values 0, 1, 2 should produce an average of 1.
>
>(0+1+2)/3 = 1.
>
>But SQL is returning a value as if my 0s were NULLs and had not been
>factored in:
>
> (1+2)/2 = 1.5
>
>Does anyone know why this is happening and how to fix it?
Hi Lee,
I was unable to reproduce this behaviour. Can you post a full repro
script (CREATE TABLE statements, INSERT statements, and the offending
query) that I can run on my test server and that shows the behaviour you
are seeing on your machine?
I suspect something else is biting you, but I have to see a repro to
find out what it is.
--
Hugo Kornelis, SQL Server MVP
My SQL Server blog: http://sqlblog.com/blogs/hugo_kornelis
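For reference, below is a minimal sketch of the kind of repro script being
requested. The #data temp table, its column names, and the sample dates are
assumptions for illustration only, not taken from Lee's actual schema. On a
clean table like this, AVG() does include zeros and skips only NULLs:

-- Hypothetical repro: table name, columns, and values are assumed
CREATE TABLE #data
    (mydate   datetime     NOT NULL,
     mycolumn decimal(9,2) NULL);

INSERT INTO #data (mydate, mycolumn) VALUES ('20080106', 0);  -- the zero
INSERT INTO #data (mydate, mycolumn) VALUES ('20080107', 1);
INSERT INTO #data (mydate, mycolumn) VALUES ('20080108', 2);

-- AVG counts the 0: this returns 1, i.e. (0 + 1 + 2) / 3
SELECT AVG(mycolumn)
FROM   #data
WHERE  mydate > '20080105' AND mydate < '20080110';

-- Only NULLs are left out: replace the 0 with NULL and AVG returns 1.5
UPDATE #data SET mycolumn = NULL WHERE mycolumn = 0;

SELECT AVG(mycolumn)
FROM   #data
WHERE  mydate > '20080105' AND mydate < '20080110';

DROP TABLE #data;

If a script along these lines still drops the zero on Lee's server, that is
exactly the repro Hugo is asking to see.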