Posted by Boot2TheHead on 12/08/06 17:34
This one cost me a solid half hour yesterday. I'm wondering why on
earth the default precision and scale for the decimal type are (18,0).
Maybe I'm mistaken, but a decimal datatype sort of implies that you'd
want something after the decimal point!
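In case it helps anyone reproduce it, here's roughly what bit me
(variable names are just for illustration):

    -- DECIMAL with no precision/scale comes out as DECIMAL(18,0),
    -- so fractional values get rounded away on assignment
    DECLARE @d DECIMAL;          -- effectively DECIMAL(18,0)
    SET @d = 1.5;
    SELECT @d;                   -- returns 2, not 1.5

    -- Spelling out the precision/scale keeps the fraction
    DECLARE @d2 DECIMAL(12,6);
    SET @d2 = 1.5;
    SELECT @d2;                  -- returns 1.500000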
Question is, can I set this database-wide? Say, so that every new
decimal column defaults to (12,6) or something like that? I haven't
seen anything about this in the googling I have done...
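The closest workaround I can think of is an alias type with the
precision baked in, but that isn't really database-wide since you'd
still have to name it on every column. Rough sketch only, the type and
table names are made up:

    -- Alias type carrying the desired precision/scale
    CREATE TYPE dbo.Dec12_6 FROM DECIMAL(12,6);

    -- Columns then pick it up, but only if you remember to use it
    CREATE TABLE dbo.Example
    (
        Amount dbo.Dec12_6 NOT NULL
    );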