Posted by CptDondo on 12/06/06 17:09
Erwin Moller wrote:
> CptDondo wrote:
>
>
>>I am working on an embedded system. The entire configuration for the
>>system is stored in an XML file, which is pretty long.
>>
>>It takes about 3 seconds to open the file using domxml_open_file.
>>
>>Breaking the file into smaller files is not possible; a single XML file
>>is a part of the design requirement.
>>
>>Right now we're opening and freeing the file every time any data is
>>requested from the file, which is quite often. This means that the user
>>ends up waiting about 5 seconds total before the page is generated.
>>
>>It would be nice if we could open the file once per session, and keep
>>the file open throughout the session, rereading only when the file on
>>disk changes.
>>
>>Is this possible?
>
>
> Hi Yan,
>
> Yes, this is possible, but I seriously doubt it will increase performance.
> Say you have a 1 MB file.
> You store it into the session like $_SESSION["hugestructure"] = <yourfile>.
>
> When the script ends, the session is serialized and written to a session
> file in some session directory.
>
> Next time that session is needed, the whole file must be read back into
> memory in $_SESSION["hugestructure"], even if you don't use it.
>
> I expect the overhead of serializing the file and saving it to disk would
> take even longer than just opening it when needed.
Well, I was hoping for some sort of magical server-side caching where I
could stash the $dom.
Passing it back and forth is not practical; we have an 11 Mbps network
that serves 600 nodes, so bandwidth is a *huge* concern. We're already
using a compressed protocol to communicate, and gateways and relays to
minimize the impact of broadcasts.
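For the archives, the session version I was imagining would look roughly
like this (sketched with the modern DOMDocument API rather than domxml; the
session key is made up). Note that PHP won't serialize a DOM object, so the
session can only hold the raw XML text, which saves the disk read but not
the parse:

```php
<?php
// Naive session-cache sketch. PHP cannot serialize DOM objects, so the
// best a session can hold is the raw XML text; re-parsing still happens
// on every request. The session key 'config_xml' is hypothetical.

function get_dom($xml_path) {
    if (!isset($_SESSION['config_xml'])) {
        $_SESSION['config_xml'] = file_get_contents($xml_path);
    }
    $dom = new DOMDocument();
    $dom->loadXML($_SESSION['config_xml']);
    return $dom;
}
```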
>
> Personally I would rethink the design of the application. Do you really need
> such a huge file so often? Which information is used? Can you translate the
> file to a few tables in a database and just query what you need when you
> need it?
The XML file itself is 300 KB; not really huge by modern standards, and on
a normal server it would not be an issue. Alas, I am working with a
200 MHz embedded box with 32 MB RAM, so we're trying to squeeze as much
as we can out of it.
We open the file once per page load to read configuration and data that
pre-fill a form, and then possibly to save any changes the user has made.
A single human-readable file with all of the information is a *huge*
benefit for our customers; something we're not likely to give up. I
think, all in all, I'd rather have this particular customer wait a bit.
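What we may actually try is caching the values extracted from the XML on
disk and re-parsing only when the file's mtime changes, which is the
"reread only when the file on disk changes" idea. A rough sketch (the
function name, cache path, and the flattening logic are all illustrative):

```php
<?php
// Sketch: cache the values extracted from the XML file, and re-parse
// only when the file on disk is newer than the cache.

function load_config($xml_path, $cache_path) {
    clearstatcache();

    // Serve the cached copy if it is at least as new as the XML file.
    if (is_file($cache_path) && filemtime($cache_path) >= filemtime($xml_path)) {
        return unserialize(file_get_contents($cache_path));
    }

    // Otherwise parse the XML and rebuild the cache.
    $dom = new DOMDocument();
    $dom->load($xml_path);

    $config = array();
    foreach ($dom->getElementsByTagName('*') as $node) {
        // Flatten leaf elements into name => value pairs (illustrative only;
        // a real config would need a smarter mapping).
        if ($node->childNodes->length === 1
                && $node->firstChild->nodeType === XML_TEXT_NODE) {
            $config[$node->nodeName] = trim($node->nodeValue);
        }
    }

    file_put_contents($cache_path, serialize($config));
    return $config;
}
```

That pays the 3-second parse once per change to the file instead of once
per page load, at the cost of one stat() per request.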
>
> If this is no option for you, you might try a 'shared memory' approach in
> PHP.
> Here is more info:
> http://nl3.php.net/manual/en/ref.shmop.php
I may follow up on that in V2. :-) It looks interesting; I don't know
if we can shove a $dom in there and retrieve it.
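From what I can tell, the $dom itself can't go in (DOM objects don't
serialize), but a serialized array of the extracted values should fit fine.
A minimal sketch, assuming the shmop extension is compiled in; the key,
segment size, and function names are arbitrary choices:

```php
<?php
// Shared-memory cache sketch using the shmop extension.
// The key and segment size below are arbitrary illustrative choices.

define('CFG_SHM_KEY', 0x434647);    // hypothetical well-known key ("CFG")
define('CFG_SHM_SIZE', 512 * 1024); // must be large enough for the data

function shm_store_config(array $config) {
    $data = serialize($config);
    $shm = shmop_open(CFG_SHM_KEY, 'c', 0644, CFG_SHM_SIZE);
    // Prefix the payload with its length so the reader knows how much to take.
    shmop_write($shm, sprintf('%10d', strlen($data)) . $data, 0);
}

function shm_fetch_config() {
    $shm = @shmop_open(CFG_SHM_KEY, 'a', 0, 0);
    if ($shm === false) {
        return false; // nothing cached yet
    }
    $len  = (int) shmop_read($shm, 0, 10);
    $data = shmop_read($shm, 10, $len);
    return unserialize($data);
}
```

The segment survives across requests, so every page gets the parsed values
without touching the XML at all; shmop_delete() would drop it when the file
changes.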
--Yan