Posted by David Portas on 12/08/06 23:05
Boot2TheHead wrote:
> This one cost me a solid half hour yesterday. I'm wondering why on
> earth the default precision for a decimal type is 18,0. Maybe I'm
> mistaken. A decimal datatype sort of implies that you'd want something
> after the decimal!
>
> Question is, can I set this database-wide? Like all new decimal
> datatypes have a precision of 12,6 or something like that? I haven't
> seen anything about this in the googling I have done...

I think the real question is: why would you NOT want to specify the
precision and scale when you use DECIMAL? If it is really important to
you not to spell out the precision and scale each time, then create a
user-defined type for it (not something I would generally recommend,
though).
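
As a minimal sketch of that alias-type approach (the names dbo.Price and
dbo.Items are just illustrations, and this assumes SQL Server 2005 or
later; on SQL Server 2000 the equivalent is sp_addtype):

-- Alias type that fixes the precision and scale in one place
CREATE TYPE dbo.Price FROM DECIMAL(12,6) NOT NULL;

-- Columns declared with the alias type get DECIMAL(12,6)
CREATE TABLE dbo.Items (
    ItemID INT NOT NULL PRIMARY KEY,
    UnitPrice dbo.Price NOT NULL
);

That gives you a single place to define the precision and scale, but it
still is not a database-wide default for DECIMAL itself.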
--
David Portas, SQL Server MVP
Whenever possible please post enough code to reproduce your problem.
Including CREATE TABLE and INSERT statements usually helps.
State what version of SQL Server you are using and specify the content
of any error messages.
SQL Server Books Online:
http://msdn2.microsoft.com/library/ms130214(en-US,SQL.90).aspx