Posted by Toby Inkster on 01/18/07 09:59
J.O. Aho wrote:
> Why go to so much work when you can easily protect a subdirectory and
> its subdirectories from unauthorized people, and set robot rules so the
> subdirectory isn't indexed (most of the serious ones do respect that).
If the directory is password protected properly (i.e. HTTP authentication,
or a decent server-side script), then there's no need to set up any
robots.txt rules to block robots from indexing it, as they simply won't be
able to access the directory anyway.
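For instance, a minimal sketch of HTTP Basic authentication for a directory
under Apache (the paths and realm name here are hypothetical, not from the
original post) might look like:

```apache
# .htaccess inside the protected subdirectory
AuthType Basic
AuthName "Private area"
# Password file created with: htpasswd -c /etc/httpd/.htpasswd someuser
AuthUserFile /etc/httpd/.htpasswd
Require valid-user
```

With this in place, every request (from a browser or a crawler) gets a 401
response unless it supplies valid credentials, so robots never see the
content to index.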
Indeed, listing the directory in robots.txt will *decrease* security, as it
alerts people (well, people who look at robots.txt anyway) that the
directory exists. Otherwise they might never have found it. Once they know
*where* it is, they can start trying to guess userids and passwords.
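To illustrate the leak: robots.txt is served publicly, so an entry like the
following (directory name made up for the example) hands the location to
anyone who fetches the file:

```
# http://example.com/robots.txt -- readable by everyone, not just robots
User-agent: *
Disallow: /private-admin/
```

A curious visitor now knows exactly which URL is worth probing, which is why
access control, not robots.txt, should do the hiding.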
--
Toby A Inkster BSc (Hons) ARCS
Contact Me ~ http://tobyinkster.co.uk/contact