Re: [PHP] network speed

Posted by Brent Baisley on 10/14/05 23:57

Once you involve the network, there are all sorts of delays that can
crop up. Each network hop is going to add a bit of overhead unless
every single step along the way has high-end routers that can route
at line speed. Otherwise the routers are doing store and forward,
which means they wait until the whole packet arrives, analyze where
it needs to go, then send it out. This happens very quickly, but say
it takes 2 ms: five non-high-end routers add 10 ms, each way.
And that's with zero packet loss. Unless your server is hosted on a
tier 1 network, your biggest problem will be latency. You can run a
traceroute to find out how many hops you are from the other
server, and maybe even tell where the biggest delay is.
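If you want a number out of PHP itself rather than from traceroute, a
minimal sketch like this (the host name and port are placeholders, not
the real feed server) times just the TCP connect; since the connect
needs one full round trip, it approximates the latency each request pays:

<?php
// Rough latency check (placeholder host/port, not the real feed server).
// A TCP connect needs one round trip, so timing fsockopen() gives a
// ballpark figure for the network latency per request.
$host = 'feed.example.com';
$port = 80;

$start = microtime(true);
$fp = fsockopen($host, $port, $errno, $errstr, 10);   // 10 s timeout
$elapsed = microtime(true) - $start;

if ($fp) {
    printf("TCP connect to %s:%d took %.1f ms\n", $host, $port, $elapsed * 1000);
    fclose($fp);
} else {
    printf("Connect failed: %s (%d)\n", $errstr, $errno);
}
?>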

Read this article to get an understanding of the effect the network
and your geographical location can have on your website's performance.
http://www.samag.com/documents/s=9894/sam0511a/0511a.htm
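
And to see which phase the missing seconds actually disappear into,
something along these lines (again only a sketch; the host and path are
placeholders for the real XML feed) splits the fetch into DNS lookup,
TCP connect, and request/transfer, then does the same fetch with
file_get_contents for comparison:

<?php
// Sketch: break one fetch of the feed into phases (placeholder host/path).
$host = 'feed.example.com';
$path = '/search.aspx?q=test';

// 1. DNS lookup
$t = microtime(true);
$ip = gethostbyname($host);
printf("DNS lookup:        %.2f s\n", microtime(true) - $t);

// 2. TCP connect (one round trip to the server)
$t = microtime(true);
$fp = fsockopen($ip, 80, $errno, $errstr, 30);
printf("TCP connect:       %.2f s\n", microtime(true) - $t);
if (!$fp) {
    die("Connect failed: $errstr ($errno)\n");
}

// 3. Send the request and read the whole response
$t = microtime(true);
fwrite($fp, "GET $path HTTP/1.0\r\nHost: $host\r\nConnection: close\r\n\r\n");
$response = '';
while (!feof($fp)) {
    $response .= fread($fp, 8192);
}
fclose($fp);
printf("Request + read:    %.2f s\n", microtime(true) - $t);

// 4. The one-call version, for comparison
$t = microtime(true);
$xml = file_get_contents("http://$host$path");
printf("file_get_contents: %.2f s\n", microtime(true) - $t);
?>

If the connect phase is where the time goes, you're looking at latency
and hop count; if it's the request/read phase, the remote server is
spending that time generating or sending the XML, and
file_get_contents isn't the culprit.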

On Oct 14, 2005, at 4:03 PM, Richard Lynch wrote:

> I've been spinning my wheels for weeks now on this, so am turning to
> the geniuses...
>
> My code has/had various combinations of:
> file_get_contents()
> fopen/fread
> fsockopen/fread
> to suck down some XML from a search engine feed
>
> The feed runs on Windows in .NET and I think it's written in C#.
>
> None of which SHOULD matter, but...
>
> So, here's the problem.
>
> file_get_contents is taking about 7-9 seconds to run.
> The vendor claims they can get results in 4-6 seconds.
>
> Somewhere, somehow, I'm losing 3 seconds of time, just in slurping
> down this XML file.
>
> This is not good.
>
> This is completely independent of processing the XML, displaying the
> results, etc. Which takes about 0.8 seconds, usually.
>
> Actually, there's an occasional 3-second "spike" in XML processing --
> not tied to any particular search term nor in any pattern I can
> find...
> But that's, hopefully, irrelevant.
>
> I've tried the following:
> time wget [URL]
> surf to [URL]
> running a PHP bench on the Windows server (local to XML engine)
> surfing to [URL] on the Windows server
>
> Nothing I do seems to make much difference, though the tests on the
> Windows box are a second or so "faster" than the remote.
>
> These tests have all been too ad hoc to have a nice chart of numbers
> or anything pretty for you to look at... So far.
>
> The one sticking point is that another site, using the same feed, is
> faster than we are, though also not as fast as the feed vendor says it
> should be.
>
> I can understand that file_get_contents is going to add SOME overhead,
> but 3 seconds sounds a bit "too much"
>
> Is it just me?
>
> Any ideas where 3 seconds could be taken up, just in
> file_get_contents?
>
> Is it just that the Linux box and Windows box don't like each other?
>
> --
> Like Music?
> http://l-i-e.com/artists.htm
>
> --
> PHP General Mailing List (http://www.php.net/)
> To unsubscribe, visit: http://www.php.net/unsub.php
>
>
>

--
Brent Baisley
Systems Architect
Landover Associates, Inc.
Search & Advisory Services for Advanced Technology Environments
p: 212.759.6400/800.759.0577

 
