Posted by Sebastiaan 'CrashandDie' Lauwers on 07/13/07 10:41
busnet wrote:
> I also thought about that, but it won't be that much faster, I'm
> afraid. A file still has to be read every time.
Not exactly.
It is possible to store a file in RAM (a copy would have to be written
back to disk regularly so the contents aren't lost if there is a
failure, but still).
What I mean is that a good operating system is going to cache the files
that are under heavy access. GNU/Linux can do this if the kernel is of
a decent flavour; I recall the Debian one doing it without any
problems.
Also, without hacking the operating system too severely, you can set
aside a chunk of RAM and mount it as a filesystem (a ramdisk, or tmpfs
on Linux). That gives you very fast storage that keeps whatever files
you put there in memory, and you can write them back to disk, or drop
them, if they haven't been accessed in a given amount of time.
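Something like this, for instance (just a sketch: it assumes /dev/shm
is mounted as tmpfs, which is the case on most modern Linux
distributions, and both file paths are made up):

<?php
// Sketch only: /dev/shm is assumed to be a tmpfs (RAM-backed) mount,
// and both paths below are made up for the example.
$source = '/var/www/data/bigfile.dat';   // the slow file on disk
$cache  = '/dev/shm/myapp_bigfile.dat';  // its RAM-backed copy

if (!file_exists($cache)) {
    // First hit: copy the file into RAM-backed storage.
    copy($source, $cache);
}

// Every subsequent read comes straight out of memory.
$contents = file_get_contents($cache);
?>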
A last solution, I guess, would be to fork() the application and keep
one part of it running in the background, holding all the variables
and objects in RAM, and then get those values back one way or another.
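Something along these lines, assuming the pcntl extension is available
(normally only under the CLI, not under a web server module), and with
made-up data:

<?php
// Sketch only: requires the pcntl extension (CLI SAPI).
// The data is made up; the child would still need a socket or a
// pipe to hand the values back, which is not shown here.
$data = array('expensive' => 'stuff that took a long time to build');

$pid = pcntl_fork();
if ($pid == -1) {
    die("fork failed\n");
} elseif ($pid == 0) {
    // Child: stays behind with $data still in RAM.
    sleep(300);
    exit(0);
}
// Parent: carries on and terminates as usual.
echo "background child: $pid\n";
?>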
Doing it like that, you might in fact run into trouble, as PHP has a
max execution time of 30 seconds (default value), and that isn't
specifically *long*! As such, it might just be easier to run a daemon
in whatever language (be it PHP, C or C++) that would act as a server
you could query easily. The daemon could be started up by the PHP
script itself: if there is no answer, a small exec() and there you
go...
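Roughly like this (a sketch only: the port, the daemon path and the
"GET key" request format are all made up, so adapt them to whatever
the daemon really speaks):

<?php
// Sketch only: daemon.php, port 9999 and the request format are
// hypothetical placeholders.
function query_daemon($request)
{
    $fp = @fsockopen('127.0.0.1', 9999, $errno, $errstr, 1);
    if (!$fp) {
        // No answer: start the daemon in the background, retry once.
        exec('php /path/to/daemon.php > /dev/null 2>&1 &');
        sleep(1);
        $fp = @fsockopen('127.0.0.1', 9999, $errno, $errstr, 1);
        if (!$fp) {
            return false;
        }
    }
    fwrite($fp, $request . "\n");
    $answer = fgets($fp, 4096);
    fclose($fp);
    return $answer;
}

$value = query_daemon('GET some_key');
?>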
These are all pretty random ideas; I can try to elaborate on one of
them if you are interested.
> Thanks anyway.
HTH,
S.