Posted by Bonge Boo! on 05/10/05 11:35
Hope someone can clarify this. I have defined a table with some columns that
are to carry pricing data as decimal numbers.
My table is as follows:
CREATE TABLE `bikes` (
  `id` int(11) default NULL,
  `tyresize` varchar(50) NOT NULL default '',
  `manufacturer` varchar(50) NOT NULL default '',
  `model` varchar(100) NOT NULL default '',
  `retail` decimal(4,2) NOT NULL default '0.00',
  `buyprice` decimal(4,2) NOT NULL default '0.00',
  `sell_ex` decimal(4,2) NOT NULL default '0.00',
  `sell_inc` decimal(4,2) NOT NULL default '0.00',
  `rimsize` varchar(5) NOT NULL default ''
) TYPE=MyISAM;
However, when I import a load of tab-delimited data and look through the
values, I see that in a number of places the decimal column has values like
72.95999999999999
Now, as I've specified the decimal precision I want, why the hell is that
happening? If I change the column type to float, the same thing happens.
Help!