Posted by Nel on 12/17/51 11:51
"Erwin Moller"
<since_humans_read_this_I_am_spammed_too_much@spamyourself.com> wrote in
message news:44a26152$0$31654$e4fe514c@news.xs4all.nl...
> Nel wrote:
>
>> I have a web site that seems to be getting hammered by lots of bots.
>>
>> Some genuine, some questionable. Some just users that seem to be
>> downloading the site en-masse. I have started using php & mysql to track
>> the IP, host, browser, proxy of all visitors and am trying to gauge how
>> many pages to serve within one hour before restricting access.
>>
>> What do you consider to be a reasonable number of visits per hour? Do
>> many bots rush through a site?
>>
>> I know this is a vague question, but at least you guys may have some
>> experience in this area. Before you say it I have tried to Google for
>> this - no luck at all.
>>
>> Is there a better way to block abuse of a site???
>>
>> Nel.
>
> Hi Nel,
>
> Why do you want to restrict access to your site?
> If your site is popular, I expect that the extra bandwidth bill from your
> ISP is a luxury problem.
>
> I also used bots to download a site on my local HD for offline browsing
> (this was before ADSL was cheap).
>
> So why do you want to restrict your visitors?
>
> Just my 2 cents.
>
> Regards,
> Erwin Moller
It was mainly for bots that refuse to abide by the robots.txt file.
LinkWalker is one that springs to mind - it just keeps on taking.
Nel
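For what it's worth, the counting scheme described above (log each visitor's IP, count pages served in the last hour, and block anyone over a threshold) can be sketched in a few lines. The thread's actual setup is PHP and MySQL, so this in-memory Python sliding-window version is only an illustration, and the limits are made-up numbers you would tune to your own traffic:

```python
import time
from collections import defaultdict, deque

# Hypothetical limits - not from the thread; tune for your own traffic.
MAX_REQUESTS = 300      # pages allowed per window
WINDOW_SECONDS = 3600   # one hour

_hits = defaultdict(deque)  # ip -> timestamps of that IP's recent requests

def allow_request(ip, now=None):
    """Return True if this IP is still under the hourly page limit."""
    now = time.time() if now is None else now
    q = _hits[ip]
    # Drop timestamps that have aged out of the one-hour window.
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()
    if len(q) >= MAX_REQUESTS:
        return False  # over the limit: serve a 403/429 instead of the page
    q.append(now)
    return True
```

A persistent store (the MySQL table Nel mentions) works the same way, just with the timestamps kept in a table keyed by IP and the purge done with a DELETE on old rows.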