Posted by CAH on 04/04/06 13:40
> I wouldn't expect all (or even most) robots to be easily identified by
> the user-agent. Maybe you could make an array of the most common ones
> (Googlebot, Inktomi, etc.) and loop through it with the logic I
> suggested. I also don't think you could check to see if it's a browser,
> because firewalls & proxy servers may not send that information through.
I see what you mean.
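For reference, a rough sketch of the loop you describe might look like
this (the bot list is only an example, not a complete one, and it assumes
your PHP version allows session.use_trans_sid to be changed at runtime):

<?php
// Example list of common robot user-agent substrings (incomplete sketch).
$bots  = array('Googlebot', 'Slurp', 'msnbot');
$agent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
foreach ($bots as $bot) {
    // Case-insensitive substring match against the visitor's user-agent.
    if (stristr($agent, $bot) !== false) {
        // Looks like a known robot: do not append PHPSESSID to URLs.
        ini_set('session.use_trans_sid', '0');
        break;
    }
}
session_start();
?>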
Do you think this solution will work?
"Using .htaccess often, you need to put the following two lines in the
..htaccess file, if your host is using PHP as an Apache module:
php_value session.use_only_cookies 1
php_value session.use_trans_sid 0 "
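If the host runs PHP as CGI instead, those php_value lines will not work
(Apache does not understand the directive without mod_php). As far as I
know, the same settings can be made from the script itself, before
session_start() is called:

<?php
// Runtime equivalent of the two .htaccess lines above.
ini_set('session.use_only_cookies', '1'); // accept session IDs from cookies only
ini_set('session.use_trans_sid', '0');    // never rewrite URLs to include PHPSESSID
session_start();
?>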
I think it does, and even though you then have to rely on cookies, I
think it is the better solution: today that is a small drawback compared
to the problems with search engines.
If this solution works:
User-agent: Googlebot
Disallow: /*PHPSESSID
it would be by far the simplest. However, I do not feel too sure that it
works (as far as I know, the * wildcard in Disallow is a Googlebot-specific
extension rather than part of the original robots.txt standard), and I
have no opportunity to check it at this time.
Regards
Mads