Posted by Nikita the Spider on 10/11/06 18:00
In article
<doraymeRidThis-6906C5.14543011102006@news-vip.optusnet.com.au>,
dorayme <doraymeRidThis@optusnet.com.au> wrote:
> In article
> <NikitaTheSpider-D7E84C.23275410102006@news-rdr-02-ge0-1.southeas
> t.rr.com>,
> Nikita the Spider <NikitaTheSpider@gmail.com> wrote:
> > I've set up several spamtrap addresses to study this. Eventually I'll
> > write a short article about my findings, but in the meantime I'll
> > summarize here. I have three email addresses all on the same page. One
> > is naked (i.e. just foo@example.com), one is entity encoded (i.e.
> > &#102;&#111;&#111; etc.) and one is added to the page by Javascript.
> > The number of spams each has gotten to date is as follows:
> >
> > naked - 715
> > entities - 2
> > javascript - 1
> >
> > In short, the entities look pretty effective to me. They're nice because
> > they don't disturb one's visitors at all and you don't have to mess
> > around with any Javascript.
> >
> It would be nice to actually know how the 2 and 1 got through...
One of the two was a standard 419 scam (see http://www.419eater.com/ if
you're not familiar with these) so I could believe that an actual human
clicked on the link. But the one that got through to both the
Javascript- and entity-protected addresses was a garden-variety spam. It
really surprises me that I got only one. I figured that once I was on
the list, the floodgates would open.
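For anyone curious what the entity-encoded variant above looks like in
practice, here's a rough sketch (the function name is mine, not from the
experiment): each character of the address is written as an HTML numeric
character reference, so a browser renders the normal address but the raw
page source never contains the literal string.

```python
def entity_encode(address):
    # Turn every character into an HTML numeric character reference,
    # e.g. "f" -> "&#102;", "o" -> "&#111;", "@" -> "&#64;".
    # Browsers decode these when rendering, so visitors see the
    # address normally, but a harvester scanning raw source for
    # something@something patterns finds nothing.
    return "".join("&#%d;" % ord(ch) for ch in address)

print(entity_encode("foo@example.com"))
```

A harvester would have to decode entities to recover the address, which
(going by the counts above) most apparently don't bother doing.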
> But, this is not always acceptable. I have no idea how the robots
> work, how clever they are, whether they in fact look at source or
> output or both.
I'd be surprised if any do more than look through the source.
> Your stats would be more meaningful if you could
> say more about the implementation. Interesting experiment though,
> Spider. Look forward to your article.
Thanks, will explain methodology, implementation, etc. and post a link
to the article here eventually.
--
Philip
http://NikitaTheSpider.com/
Whole-site HTML validation, link checking and more