Posted by Nel on 12/18/00 11:51
I have a web site that seems to be getting hammered by lots of bots.
Some genuine, some questionable. Some are just users who seem to be
downloading the site en masse. I have started using PHP & MySQL to track
the IP, host, browser, and proxy of all visitors, and I'm trying to gauge how many
pages to serve within one hour before restricting access.
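The counting side of this can be done with a simple per-IP sliding window: keep the timestamps of each visitor's recent requests, drop the ones older than an hour, and refuse service once the count passes a threshold. A minimal sketch below, in Python for brevity (the same logic ports to PHP with the counts kept in a MySQL table); the 300-pages-per-hour limit is a placeholder you'd tune to your own traffic:

```python
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 3600   # one hour, as in the post
MAX_REQUESTS = 300      # hypothetical threshold; tune to your traffic

# Timestamps of recent requests, keyed by client IP.
_hits = defaultdict(deque)

def allow_request(ip, now=None):
    """Return True if this IP is still under the hourly limit."""
    now = time.time() if now is None else now
    window = _hits[ip]
    # Drop timestamps that have aged out of the one-hour window.
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    if len(window) >= MAX_REQUESTS:
        return False  # over the limit: serve a 403 or throttle page
    window.append(now)
    return True
```

In a PHP setup you'd call the equivalent check at the top of each page, using `$_SERVER['REMOTE_ADDR']` as the key, and either log and deny or slow the response once the limit is hit.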
What do you consider to be a reasonable number of visits per hour? Do many
bots rush through a site?
I know this is a vague question, but at least you guys may have some
experience in this area. Before you say it, I have already tried to Google for
this - no luck at all.
Is there a better way to block abuse of a site?
Nel.