Posted by kobus.dp on 02/20/07 21:20
Here are the spider bandwidth stats for the 19th:
Spider                                   Bandwidth   Last visit
Googlebot (Google)                       262.93 MB   19 Feb 2007 - 23:54
Inktomi Slurp                             74.83 MB   19 Feb 2007 - 23:58
MSNBot                                   270.99 MB   19 Feb 2007 - 23:34
Unknown robot (identified by 'robot')    597.13 KB   19 Feb 2007 - 05:23
That's 1.2+ Gig for the day?
Yet when I look at my overall summary for the 19th:
Day           Number of visits   Pages   Hits    Bandwidth
19 Feb 2007   287                24266   45669   181.29 MB
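
Just summing the four robots listed above already comes to about 609 MB (a quick Python sanity check; the names and figures are copied straight from the stats):

# Bandwidth per spider on 19 Feb 2007, taken from the list above (in MB)
spider_mb = {
    "Googlebot (Google)": 262.93,
    "Inktomi Slurp": 74.83,
    "MSNBot": 270.99,
    "Unknown robot": 597.13 / 1024,  # 597.13 KB converted to MB
}
print(f"Robots listed above:   {sum(spider_mb.values()):.2f} MB")  # ~609.33 MB
print("Overall daily summary: 181.29 MB")

That's more than three times what the whole-day summary reports.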
I smell a rat...
thx
C
On Feb 20, 4:25 pm, "Jukka K. Korpela" <jkorp...@cs.tut.fi> wrote:
> Scripsit Cinamon Thunder:
>
> > Google and other engines spider my website every single day and as a
> > result generate QUITE A BIT of traffic.
>
> Do you mean the _spider visits_? Even if a spider visits all your pages
> every day, that's still just one HEAD request per page. If that's
> considerably more than normal traffic to your site, I'd be more worried
> about the low usage than the spiders.
>
> (I'm assuming that your server sends adequate Last-Modified headers so that
> spiders can just send a HEAD request and see that nothing has changed,
> instead of retrieving the actual content.)
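
(Illustrating the point above: a minimal Python sketch of what that freshness check looks like on the wire, with example.com and /page.html standing in as placeholders for your own host and page:)

import http.client

# HEAD request: headers only, no page body is transferred.
conn = http.client.HTTPConnection("example.com")   # placeholder host
conn.request("HEAD", "/page.html")                 # placeholder page
resp = conn.getresponse()
last_modified = resp.getheader("Last-Modified")
print(resp.status, last_modified)
conn.close()

# Conditional GET: if the page is unchanged since that date, the server
# answers "304 Not Modified" with an empty body, so a re-visit costs
# almost no bandwidth.
if last_modified:
    conn = http.client.HTTPConnection("example.com")
    conn.request("GET", "/page.html",
                 headers={"If-Modified-Since": last_modified})
    resp = conn.getresponse()
    print(resp.status)  # 304 when nothing has changed
    conn.close()

A well-behaved spider that gets 304 responses like this transfers almost nothing on repeat visits.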
>
> > <META NAME="revisit-after" CONTENT="7 days">
>
> Important search engines probably ignore that routinely, since there has
> been too much abuse.
>
> > Any advice would be appreciated!
>
> Please convince me that daily visits by spiders are a _problem_. Many
> webmasters would pay real money to achieve such a situation.
>
> --
> Jukka K. Korpela ("Yucca") http://www.cs.tut.fi/~jkorpela/