Posted by Michael Trausch on 09/28/99 11:42
Hello,
I've been searching around, but I'm probably not using the right
keywords or something. I'm working with a database table that has two
columns holding UNIX timestamps. The problem is that when I try to use
a timestamp as an integer, it turns into 0. Here is debugging output
that I put into the application:
0 is snooze till time.
Array
(
[tid] => 12
[takesPlace] => 1142411580
[title] => asdfasdf
[snooze_until] => 1142474505
)
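For context, that output comes from something along these lines (a
simplified sketch, not my exact code; $row stands in for the row
fetched from the database):

<?php
// Simplified sketch -- $row stands in for the row fetched from the database.
$row = array(
    'tid'          => 12,
    'takesPlace'   => '1142411580',
    'title'        => 'asdfasdf',
    'snooze_until' => '1142474505',
);

echo (int) $row['snooze_until'] . " is snooze till time.\n"; // prints 0 for me
print_r($row);                                               // prints the Array block above
?>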
takesPlace and snooze_until are the columns that hold UNIX timestamps,
but when I try to use one as a number, it becomes 0 (as you can see in
the first line of the output). Now, unless I'm really off my rocker,
4,294,967,295 > 1,142,474,505. Since the largest 32-bit number is
4,294,967,295 unsigned, or 2,147,483,647 signed, why is a number that
is only a little more than half of even the signed maximum being turned
into zero?
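Just to sanity-check the range argument, this is the kind of thing I
mean (again only a sketch, assuming PHP_INT_MAX reflects the platform's
largest signed integer):

<?php
// Sanity check (sketch): PHP_INT_MAX is the largest signed integer on this build.
var_dump(PHP_INT_MAX);              // int(2147483647) on a 32-bit build
var_dump(1142474505 < PHP_INT_MAX); // bool(true) -- the timestamp fits easily
var_dump((int) '1142474505');       // what I'd expect: int(1142474505), not 0
?>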
Is this a bug that I'm just not finding reported anywhere, or a quirk
that isn't documented where I've been looking?
This is running on a dual-CPU, 32-bit Intel machine with Linux 2.6.9,
if that makes any sort of difference.
Thanks,
Mike