Posted by "Richard Lynch" on 10/14/05 23:03
I've been spinning my wheels for weeks now on this, so am turning to
the geniuses...
My code has/had various combinations of:
file_get_contents()
fopen/fread
fsockopen/fread
to suck down some XML from a search-engine feed.
The feed runs on Windows in .NET, and I think it's written in C#.
None of which SHOULD matter, but...
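For the record, the timing harness is nothing fancier than something like
this (the URL is a placeholder, not the real feed):

```php
<?php
// Time ONLY the slurp, so fetch time is isolated from XML parsing
// and display. $url is a stand-in for the real feed URL.
$url = 'http://example.com/feed.xml';

$start = microtime(true);
$xml = file_get_contents($url);
$elapsed = microtime(true) - $start;

printf("fetched %d bytes in %.3f seconds\n", strlen($xml), $elapsed);
```

(The fopen/fread and fsockopen/fread variants get wrapped the same way,
with the same microtime() bracketing.)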
So, here's the problem.
file_get_contents is taking about 7-9 seconds to run.
The vendor claims they can get results in 4-6 seconds.
Somewhere, somehow, I'm losing 3 seconds of time, just in slurping
down this XML file.
This is not good.
This is completely independent of processing the XML, displaying the
results, etc. Which takes about 0.8 seconds, usually.
Actually, there's an occasional 3-second "spike" in XML processing --
not tied to any particular search term nor in any pattern I can
find...
But that's, hopefully, irrelevant.
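One thing I haven't done yet is break the fetch itself into phases, to see
whether the lost seconds are in DNS, TCP connect, time-to-first-byte, or
the transfer proper. A rough fsockopen-based split would look something
like this (URL, port, and path are placeholders):

```php
<?php
// Split the fetch into DNS / connect / first-byte / transfer phases.
// $url is a stand-in for the real feed URL; plain HTTP/1.0 GET.
$url  = 'http://example.com/feed.xml';
$host = parse_url($url, PHP_URL_HOST);
$path = parse_url($url, PHP_URL_PATH);

$t0 = microtime(true);
$ip = gethostbyname($host);                      // DNS lookup
$t1 = microtime(true);
$fp = fsockopen($ip, 80, $errno, $errstr, 10);   // TCP connect
$t2 = microtime(true);
fwrite($fp, "GET $path HTTP/1.0\r\nHost: $host\r\n\r\n");
$statusLine = fgets($fp);                        // first byte back
$t3 = microtime(true);
$body = '';
while (!feof($fp)) {                             // rest of the response
    $body .= fread($fp, 8192);
}
fclose($fp);
$t4 = microtime(true);

printf("dns=%.3f connect=%.3f first-byte=%.3f transfer=%.3f total=%.3f\n",
    $t1 - $t0, $t2 - $t1, $t3 - $t2, $t4 - $t3, $t4 - $t0);
```

If the big chunk shows up in "first-byte", the engine itself is slow to
answer; if it's in "transfer", then it's the pipe (or chunking) between
the boxes.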
I've tried the following:
time wget [URL]
surf to [URL]
running a PHP bench on the Windows server (local to XML engine)
surfing to [URL] on the Windows server
Nothing I do seems to make much difference, though the tests on the
Windows box are a second or so "faster" than the remote.
These tests have all been too ad hoc to have a nice chart of numbers
or anything pretty for you to look at... So far.
The one sticking point is that another site, using the same feed, is
faster than we are, though also not as fast as the feed vendor says it
should be.
I can understand that file_get_contents is going to add SOME overhead,
but 3 seconds sounds a bit "too much."
Is it just me?
Any ideas where 3 seconds could be taken up, just in file_get_contents?
Is it just that the Linux box and Windows box don't like each other?
--
Like Music?
http://l-i-e.com/artists.htm