Posted to users@spamassassin.apache.org by da...@chaosreigns.com on 2011/03/23 16:56:25 UTC

Spam Eating Monkey causing 100% false positives for large institutions

In the recent sa-updates, the Spam Eating Monkey (SEM) rules were
inappropriately enabled.  If you query them beyond their published
limits, they start returning 100% false positives.  Those limits are
"more than 100,000 queries per day or more than 5 queries per second
for more than a few minutes".

To disable them, add to your /etc/spamassassin/local.cf:

score RCVD_IN_SEMBLACK    0
score URIBL_SEM           0
score URIBL_SEM_FRESH     0
score URIBL_SEM_FRESH_10  0
score URIBL_SEM_FRESH_15  0
score URIBL_SEM_RED       0
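
To verify that the overrides parse cleanly, you can run SpamAssassin's
lint mode, which reads the configuration and exits non-zero on any
error:

  spamassassin --lint

A score of 0 disables a rule without removing it, so the override
keeps working across future rule updates until the scores are fixed
upstream.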


More info:

http://www.spamtips.org/2011/03/sem-rules-mistakenly-enabled-how-to.html
https://issues.apache.org/SpamAssassin/show_bug.cgi?id=6220


Why wasn't this posted to this users list?

Maybe we should have an announce or alert list for things like this?

-- 
"The price of freedom is the willingness to do sudden battle, anywhere,
at any time, and with utter recklessness." - Robert A. Heinlein
http://www.ChaosReigns.com

Re: Spam Eating Monkey causing 100% false positives for large institutions

Posted by RW <rw...@googlemail.com>.
On Wed, 23 Mar 2011 11:56:25 -0400
darxus@chaosreigns.com wrote:

> In the recent sa-updates, the Spam Eating Monkey (SEM) rules were
> inappropriately enabled.  If you query them beyond their published
> limits, they start returning 100% false positives.

On the bright side, anyone getting 100% false positives on any test has
no need of spam filtering.

Re: Spam Eating Monkey causing 100% false positives for large institutions

Posted by da...@chaosreigns.com.
On 03/23, Blaine Fleming wrote:
> As soon as the bug was reported on the dev list, I disabled the
> 127.0.0.255 response code to avoid any additional issues.  I will be

That was very kind of you.

> 3AM.  Personally, I don't think it is unreasonable to start returning
> this response code for someone who is performing well over 100M
> queries/day against a free list with a limit of 100K/day.  This policy

I agree.  I don't think SEM did anything inappropriate here.  

What was inappropriate was SA releasing network rules marked
"#testrules".  It should be interesting to see how that happened.

-- 
"I'd rather be happy than right any day."
- Slartiblartfast, The Hitchhiker's Guide to the Galaxy
http://www.ChaosReigns.com

Re: Spam Eating Monkey causing 100% false positives for large institutions

Posted by Karsten Bräckelmann <gu...@rudersport.de>.
On Wed, 2011-03-23 at 11:08 -1000, Warren Togami Jr. wrote:
> On 3/23/2011 10:58 AM, Karsten Bräckelmann wrote:

> > Ugh, nasty boy. ;)  You do realize they wouldn't be hammering the SEM
> > DNS servers if the test rules hadn't accidentally slipped out -- via
> > sa-update.
> >
> > Personally, I'd much prefer to have this resolved by another manual
> > rule update, so the queries should die down within another 24-48
> > hours.  Obviously, these sites do use sa-update...
> >
> > Thanks and props to Blaine for effectively disabling the limit
> > temporarily and sustaining the load for a while! :)
> 
> Agreed, that would be the ideal solution.  Who knows the procedure?
> Is that procedure documented?

Not as much as I would like it to be, but yes, it is documented.  See
some of my posts to dev@ over the last few days...


-- 
char *t="\10pse\0r\0dtu\0.@ghno\x4e\xc8\x79\xf4\xab\x51\x8a\x10\xf4\xf4\xc4";
main(){ char h,m=h=*t++,*x=t+2*h,c,i,l=*x,s=0; for (i=0;i<l;i++){ i%8? c<<=1:
(c=*++x); c&128 && (s+=h); if (!(h>>=1)||!t[s+h]){ putchar(t[s]);h=m;s=0; }}}


Re: Spam Eating Monkey causing 100% false positives for large institutions

Posted by "Warren Togami Jr." <wt...@gmail.com>.
On 3/23/2011 10:58 AM, Karsten Bräckelmann wrote:
> On Wed, 2011-03-23 at 10:18 -1000, Warren Togami Jr. wrote:
>> On 3/23/2011 7:38 AM, Blaine Fleming wrote:
>>>> In the recent sa-updates, the Spam Eating Monkey (SEM) rules were
>>>> inappropriately enabled.  [...]
>
>>> As soon as the bug was reported on the dev list, I disabled the
>>> 127.0.0.255 response code to avoid any additional issues.  I will be
>>> turning this functionality back on as soon as the SA rules are
>>> updated, which I assume will be soon.
>>
>> I would recommend blackholing those IP addresses at the firewall of
>> the DNS server, especially the sites sending 300 million+ queries
>> that have been impossible to contact.  They might finally notice they
>> have a serious configuration issue and stop querying if their mail
>> delivery backs up.
>
> Ugh, nasty boy. ;)  You do realize they wouldn't be hammering the SEM
> DNS servers if the test rules hadn't accidentally slipped out -- via
> sa-update.
>
> Personally, I'd much prefer to have this resolved by another manual
> rule update, so the queries should die down within another 24-48
> hours.  Obviously, these sites do use sa-update...
>
> Thanks and props to Blaine for effectively disabling the limit
> temporarily and sustaining the load for a while! :)
>
>

Agreed, that would be the ideal solution.  Who knows the procedure?
Is that procedure documented?

Warren

Re: Spam Eating Monkey causing 100% false positives for large institutions

Posted by Karsten Bräckelmann <gu...@rudersport.de>.
On Wed, 2011-03-23 at 10:18 -1000, Warren Togami Jr. wrote:
> On 3/23/2011 7:38 AM, Blaine Fleming wrote:
> > > In the recent sa-updates, the Spam Eating Monkey (SEM) rules were
> > > inappropriately enabled.  [...]

> > As soon as the bug was reported on the dev list, I disabled the
> > 127.0.0.255 response code to avoid any additional issues.  I will be
> > turning this functionality back on as soon as the SA rules are
> > updated, which I assume will be soon.
> 
> I would recommend blackholing those IP addresses at the firewall of
> the DNS server, especially the sites sending 300 million+ queries
> that have been impossible to contact.  They might finally notice they
> have a serious configuration issue and stop querying if their mail
> delivery backs up.

Ugh, nasty boy. ;)  You do realize they wouldn't be hammering the SEM
DNS servers if the test rules hadn't accidentally slipped out -- via
sa-update.

Personally, I'd much prefer to have this resolved by another manual
rule update, so the queries should die down within another 24-48
hours.  Obviously, these sites do use sa-update...

Thanks and props to Blaine for effectively disabling the limit
temporarily and sustaining the load for a while! :)


-- 
char *t="\10pse\0r\0dtu\0.@ghno\x4e\xc8\x79\xf4\xab\x51\x8a\x10\xf4\xf4\xc4";
main(){ char h,m=h=*t++,*x=t+2*h,c,i,l=*x,s=0; for (i=0;i<l;i++){ i%8? c<<=1:
(c=*++x); c&128 && (s+=h); if (!(h>>=1)||!t[s+h]){ putchar(t[s]);h=m;s=0; }}}


Re: Spam Eating Monkey causing 100% false positives for large institutions

Posted by "Warren Togami Jr." <wt...@gmail.com>.
On 3/23/2011 7:38 AM, Blaine Fleming wrote:
> On 3/23/2011 9:56 AM, darxus@chaosreigns.com wrote:
>> In the recent sa-updates, the Spam Eating Monkey (SEM) rules were
>> inappropriately enabled.  If you query them beyond their published
>> limits, they start returning 100% false positives.  Those limits are
>> "more than 100,000 queries per day or more than 5 queries per second
>> for more than a few minutes".
>
> As soon as the bug was reported on the dev list, I disabled the
> 127.0.0.255 response code to avoid any additional issues.  I will be
> turning this functionality back on as soon as the SA rules are
> updated, which I assume will be soon.

I would recommend blackholing those IP addresses at the firewall of
the DNS server, especially the sites sending 300 million+ queries that
have been impossible to contact.  They might finally notice they have
a serious configuration issue and stop querying if their mail delivery
backs up.
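
A minimal sketch of that, assuming a Linux box running the DNS server
and iptables (192.0.2.10 is a placeholder for one of the abusive
client addresses):

  # Silently drop DNS queries, UDP and TCP, from the offending source
  iptables -A INPUT -s 192.0.2.10 -p udp --dport 53 -j DROP
  iptables -A INPUT -s 192.0.2.10 -p tcp --dport 53 -j DROP

Dropping rather than rejecting means the client's resolver just times
out, which is exactly what would make the backed-up mail queue
noticeable.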

Warren


Re: Spam Eating Monkey causing 100% false positives for large institutions

Posted by Blaine Fleming <gr...@digital-z.com>.
On 3/23/2011 9:56 AM, darxus@chaosreigns.com wrote:
> In the recent sa-updates, the Spam Eating Monkey (SEM) rules were
> inappropriately enabled.  If you query them beyond their published
> limits, they start returning 100% false positives.  Those limits are
> "more than 100,000 queries per day or more than 5 queries per second
> for more than a few minutes".

As soon as the bug was reported on the dev list, I disabled the
127.0.0.255 response code to avoid any additional issues.  I will be
turning this functionality back on as soon as the SA rules are
updated, which I assume will be soon.

The response code of 127.0.0.255 is only returned after someone has
performed at least 100 million queries per day for 48 hours straight.
During the first 48 hours the queries are simply ignored.  Attempts
were also made to contact several of the large (300M+) query sources,
but so far only one has responded with anything more than an
autoresponder.  It turns out that even large companies don't watch
their systems closely enough to notice long delays and query failures
against a blacklist.
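
For anyone who wants to check where they stand, a lookup can be done
by hand.  Assuming the zone behind RCVD_IN_SEMBLACK is
bl.spameatingmonkey.net (my assumption; check your rules file for the
actual zone), reverse the IP's octets and query for an A record:

  # 127.0.0.2 reversed -- the conventional DNSBL test entry
  dig +short 2.0.0.127.bl.spameatingmonkey.net A

An answer inside 127.0.0.0/8 is a listing, no answer (NXDOMAIN) means
not listed, and 127.0.0.255 is the over-limit code described above,
which should be treated as no data rather than as a hit.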

If this had been a planned action, the policies would have been
changed to reflect the nature of most SA users with regard to default
blacklists.  Unfortunately, the substantial traffic was simply dropped
on SEM, and the automatic policies did what they are designed to do:
they protected the system.  The result was another very stressed SEM
admin calling me at 3AM.  Personally, I don't think it is unreasonable
to start returning this response code for someone who is performing
well over 100M queries/day against a free list with a limit of
100K/day.  This policy would most likely change if the SEM rules were
ever part of the default SA rules.

--Blaine