Posted by The Natural Philosopher on 12/05/07 13:02
Mikhail Kovalev wrote:
> I unserialize an 8MB file, which contains the db array, and put the
> new element there (then serialize the whole thing again and store)
>
I am far from understanding what you are actually doing, but it sounds
like you are willing to trade disk usage for speed and memory usage.
MySQL - or any database program - allows you to insert records randomly.
It maintains its own internal indices, so sorting and ordering are never
done at the raw data level, but by a data abstraction when you retrieve
all or part of the data for use or alteration.
To walk an N level tree will of course take N accesses to the mysqld,
but I hardly think that would take anything like the 10-20 seconds you
are currently experiencing.
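To make the "N accesses" point concrete, here is a small sketch. It simulates an adjacency-list tree (one row per node, each pointing at its parent) with an in-memory PHP array standing in for the MySQL table, since the actual schema was never posted; the node ids and the `walkFirstPath` helper are made-up illustrations, not anything from the thread:

```php
<?php
// $children[$parentId] holds the child ids - i.e. what a query like
// "SELECT id FROM nodes WHERE parent_id = ?" would return per level.
$children = array(
    0 => array(1, 2),
    1 => array(3),
    3 => array(4),
);

// Walk one path down the tree: one lookup (one query against mysqld)
// per level, so descending an N-level tree costs N small queries,
// never a full load of the whole structure into RAM.
function walkFirstPath(array $children, $rootId)
{
    $path = array($rootId);
    $id = $rootId;
    while (isset($children[$id]) && count($children[$id]) > 0) {
        $id = $children[$id][0]; // first child; in SQL: ... LIMIT 1
        $path[] = $id;
    }
    return $path;
}

print implode(' -> ', walkFirstPath($children, 0)) . "\n"; // 0 -> 1 -> 3 -> 4
```

Even at 10-20 levels that is 10-20 indexed single-row lookups, which should be milliseconds, not the 10-20 seconds a full unserialize() takes.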
MySQL is, once installed, actually a very usable and simple interface
to program PHP against.
Any other alternative would likely as not involve you in re-inventing
the database wheel, and one has to say: why bother?
I would strongly advise biting the bullet and installing MySQL, and then
coming back to get help on how to implement your curious requirements
against it.
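For what it's worth, a tree like this usually goes into MySQL as an adjacency list - one row per node with a pointer to its parent. The following is only a sketch; the table name, column names, and values are invented for illustration, not taken from the thread:

```sql
-- Hypothetical adjacency-list schema: one row per tree node.
CREATE TABLE nodes (
    id        INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    parent_id INT UNSIGNED NULL,        -- NULL for the root node
    value     TEXT,
    KEY idx_parent (parent_id)          -- index so child lookups stay fast
);

-- Adding an element inserts one row; nothing else is loaded or rewritten.
INSERT INTO nodes (parent_id, value) VALUES (42, 'new leaf');

-- Fetching one node's children is a single indexed query.
SELECT id, value FROM nodes WHERE parent_id = 42;
```

That INSERT is exactly the "add a new element without loading the whole database" behaviour being asked about.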
> On 5 Des, 13:28, Jerry Stuckle <jstuck...@attglobal.net> wrote:
>> Mikhail Kovalev wrote:
>>> Hi.
>>> I work with recursive array trees of 10-20 levels in PHP. So far I
>>> have been using serialize() to store the arrays, generating files 5+
>>> MB large, which take 10-15 seconds to unserialize and about ten times
>>> that in RAM! I've been advised to use MySQL (a relational db), but are there
>>> any other options beside that and var_export/include? Something that
>>> works in a similar way as MySQL when adding a new element without
>>> loading the whole database itself...
>>> Thanks!
>> What do you mean by "without loading the whole database itself"?
>>
>> --
>> ==================
>> Remove the "x" from my email address
>> Jerry Stuckle
>> JDS Computer Training Corp.
>> jstuck...@attglobal.net
>> ==================
>