Posted by Phil Earnhardt on 02/07/06 21:07
On Tue, 07 Feb 2006 12:13:22 +0100, "Barbara de Zoete"
<trashbin@pretletters.net> wrote:
>On Tue, 07 Feb 2006 12:07:28 +0100, ray <datasmog@hotmail.com> wrote:
>
>> There are applications that enable you to download entire static
>> websites without actually visiting them. WebGrabber and SiteSucker come
>> to mind.
>
>I very much dislike seeing someone use those applications on my site.
>Downloading over a hundred pages and all that comes with them in a short
>time, for what? I'm sure the one who does that is not going to read all
>the stuff that just got downloaded, which means it is just a waste of
>bandwidth.
>If I spot the same IP doing that more than once (yup, there are those
>people) or if I notice that it is a commercial enterprise that does that,
>the IP gets banned. I wish there were a way to block these grabbers
>altogether.
I can't imagine how you would categorically block them. OTOH, the
Robots Exclusion Protocol can be used to tell anyone who honors such
things that you don't want your website copied.
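FWIW, a rough sketch of what that looks like is below. The first block asks
every robot that honors the protocol to stay out of the whole site; the
second singles out one grabber, assuming it identifies itself as
"SiteSucker" in its User-agent header (you'd have to check your access logs
for the actual string). None of this stops a client that simply ignores
robots.txt.

  # http://www.example.com/robots.txt
  # Ask all compliant robots to stay out of the entire site:
  User-agent: *
  Disallow: /

  # Or target a specific grabber (assuming it sends "SiteSucker"
  # as its User-agent string -- verify against your logs):
  User-agent: SiteSucker
  Disallow: /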
--phil