Posted to users@spamassassin.apache.org by Asif Iqbal <iq...@qwestip.net> on 2004/08/03 13:47:35 UTC

Why only 2.3?

Hi All

For the following spam why am I getting 2.3? This is my report

Content analysis details:   (2.3 points, 5.0 required)

 pts rule name              description
---- ----------------------
--------------------------------------------------
 0.1 RATWR10_MESSID         Message-ID has ratware pattern (HEXHEX.HEXHEX@)
 1.7 SARE_RECV_IP_218079    Spam passed through possible spammer relay or source
 0.1 HTML_MESSAGE           BODY: HTML included in message
 0.0 BAYES_50               BODY: Bayesian spam probability is 50 to 56% [score: 0.5004]
 0.3 MIME_HTML_ONLY         BODY: Message only has text/html MIME parts
 0.1 HTML_50_60             BODY: Message is 50% to 60% HTML

I am using SA 2.63 with Rules_Du_Jour and I have spamcop_uri.cf in my
/etc/mail/spamassassin dir. I did install it right. Could it be possible
that my spamcop_uri is not working?

Any help/suggestion would be greatly appreciated

----- Forwarded message from neal field <br...@allsearchengines.co.uk> -----

Return-Path: <br...@allsearchengines.co.uk>
Delivered-To: iqbala@qwestip.net
Received: (qmail 16523 invoked by uid 7801); 3 Aug 2004 09:05:46 -0000
Received: from brawosser@allsearchengines.co.uk by qmail by uid 7791 with qmail-scanner-1.22-st-qms 
 (clamdscan: 0.74. spamassassin: 2.63.  Clear:RC:0(218.79.134.181):SA:0(4.0/5.0):. 
 Processed in 5.128478 secs); 03 Aug 2004 09:05:46 -0000
X-Spam-Status: No, hits=4.0 required=5.0
X-Spam-Level: ++++
Received: from NO?REVERSE?DNS (HELO allsearchengines.co.uk) ([218.79.134.181]) (envelope-sender <br...@allsearchengines.co.uk>)
          by qmail.qwestip.net (qmail-ldap-1.03) with SMTP
          for <iq...@qwestip.net>; 3 Aug 2004 09:05:41 -0000
Message-ID: <38...@allsearchengines.co.uk>
Date: Mon, 02 Aug 2004 23:54:21 -0700
From: "neal field" <br...@allsearchengines.co.uk>
User-Agent: The Bat! (v2.00.4) Personal
X-Accept-Language: en-us
MIME-Version: 1.0
To: "nickolas hollifield" <iq...@qwestip.net>
Cc: "royce held" <ns...@qwestip.net>,
	"roosevelt zelenski" <ns...@qwestip.net>,
	"art rochester" <ho...@qwestip.net>,
	"dino lanfair" <ia...@qwestip.net>,
	"donnell laurenti" <ju...@qwestip.net>,
	"everette mclain" <ro...@qwestip.net>,
	"valentin gale" <jk...@qwestip.net>,
	"scottie campellone" <ma...@qwestip.net>
Subject:  Sub:  you will not be along while you still look like in your 30s
Content-Type: text/html;
	charset="us-ascii"
Content-Transfer-Encoding: 7bit


   toegangsprys,card-ishpynzenyktatarize`



   amer^`i~ca d^~r--ugs & 0ve^rni^ght ship~'pi-ng
   [1]You will be pleasantly surprised if you click the square



   So the crew trooped forth, rather fearfully, and began to reef the
   sails and put the ship into condition to weather the stormHe stretched
   himself upon the ground beneath the dense foliage of the biggest tree
   and abandoned himself to the languor that was creeping over him

   ----- Original Message -----

   From: [2]alejandro colucci

   To: [3]'Karyl Ortiz' ; [4]'Lavonda Carr' ; [5]'Wendolyn Allen' ;
   [6]'Darby Sanders' ; [7]'Kyoko Rivera'

   Sent: Monday, July, 2004 8:46 PM

   Subject: you will not be along while you still look like in your 30s

   Although the approach to the management of patients with and without
   neutropenia is separated in clinical practice this similarity supports
   a biological basis underlying our results
   sural10palmatoria70cacharro,recriador palomeri`a

References

   1. http://www.bigpharmaworld.us/
   2. mailto:bqxdxb@vdv.com
   3. mailto:ntmb@webtv.net
   4. mailto:kwc@americanprint.com
   5. mailto:73750.1571@compuserve.com
   6. mailto:spellcheck@webster.pedantry.com
   7. mailto:roadkill@sprint.ca

----- End forwarded message -----

-- 
Asif Iqbal
PGP Key: 0xE62693C5 KeyServer: pgp.mit.edu
There's no place like 127.0.0.1

Re: Why only 2.3?

Posted by Jeff Chan <je...@surbl.org>.
On Tuesday, August 3, 2004, 4:58:13 AM, Duncan Hill wrote:
> On Tuesday 03 August 2004 12:47, Asif Iqbal might have typed:
>> Hi All
>>
>> For the following spam why am I getting 2.3? This is my report
>>
>> I am using SA 2.63 with Rules_Du_Jour and I have spamcop_uri.cf in my
>> /etc/mail/spamassassin dir. I did install it right. Could it be possible
>> that my spamcop_uri is not working?

>>    1. http://www.bigpharmaworld.us/

> Not found in SC, WS or OB.  Spamcop it to get it into SC.

FWIW it is in WS now:

  bigpharmaworld.us.ws.surbl.org.  1h30m IN A  127.0.0.2

Definitely report spam to SpamCop too.  :-)

Jeff C.
-- 
Jeff Chan
mailto:jeffc@surbl.org
http://www.surbl.org/


Re: Why only 2.3?

Posted by Duncan Hill <sa...@nacnud.force9.co.uk>.
On Tuesday 03 August 2004 12:47, Asif Iqbal might have typed:
> Hi All
>
> For the following spam why am I getting 2.3? This is my report
>
> I am using SA 2.63 with Rules_Du_Jour and I have spamcop_uri.cf in my
> /etc/mail/spamassassin dir. I did install it right. Could it be possible
> that my spamcop_uri is not working?

>    1. http://www.bigpharmaworld.us/

Not found in SC, WS or OB.  Spamcop it to get it into SC.

Re: Why only 2.3?

Posted by Jeff Chan <je...@surbl.org>.
On Tuesday, August 3, 2004, 5:50:33 AM, Asif Iqbal wrote:
> Yes, I am. And thats what I exactly suspected is that my spamcop_uri is
> not working. Not sure why. Is there a quick way to check where it is
> failing ?

Please send yourself a message with:

  http://surbl-org-permanent-test-point-MUNGED.com/

or:

  http://127.0.0.2-MUNGED/

without the -MUNGED.  You should see those messages
hit on all the SURBLs you have installed.

  http://www.surbl.org/faq.html#test-uris

Also make sure you have network tests enabled, e.g., no -L flag
on spamd.
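To make the "where is it failing" check more concrete, here is a rough sketch, in Python for illustration (SpamCopURI itself is Perl and uses Net::DNS), of how a URI is reduced to the DNS name a SURBL client queries. The two-level-TLD handling here is a hypothetical stub; real SURBL clients ship a maintained list of such TLDs.

```python
from urllib.parse import urlparse

# Simplified stub; real SURBL clients use a much longer list of
# two-level TLDs (co.uk, com.au, ...).
TWO_LEVEL_TLDS = {"co.uk", "org.uk", "com.au"}

def surbl_query_name(uri, zone="ws.surbl.org"):
    """Reduce a URI to its registered domain and build the DNS name to
    query, e.g. http://www.bigpharmaworld.us/ ->
    bigpharmaworld.us.ws.surbl.org."""
    host = urlparse(uri).hostname.lower()
    labels = host.split(".")
    # Keep two labels normally, three when the last two labels form a
    # two-level TLD such as co.uk.
    keep = 3 if ".".join(labels[-2:]) in TWO_LEVEL_TLDS else 2
    domain = ".".join(labels[-keep:])
    return f"{domain}.{zone}"

# A listed domain resolves to a 127.0.0.x address; an unlisted one gets
# NXDOMAIN. Actually resolving the name (e.g. socket.gethostbyname) is
# the network test that requires spamd to run without -L.
```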

Jeff C.
-- 
Jeff Chan
mailto:jeffc@surbl.org
http://www.surbl.org/


Re: Why only 2.3?

Posted by Duncan Hill <sa...@nacnud.force9.co.uk>.
On Tuesday 03 August 2004 15:14, Scot L. Harris might have typed:
> On Tue, 2004-08-03 at 09:59, Duncan Hill wrote:

> > Yep, and that's one reason I greylist :)  Not much actually reaches my
> > greylisting db due to other restrictions in place, but it does work
> > effectively.
>
> And using a greylisting delay of only a few minutes currently eliminates
> 98 to 99% of the spam out there.  Of course in this case a longer delay
> would be needed to let the surbl lists update with new spam.  And
> if/when the spammers try to get around it the combination of greylisting
> and using spamassassin with black lists will take care of the problem.

Yes.  I've had to whitelist my spamtraps ahead of all of my protections due to 
the spamtraps not getting spam.  Sort of defeats the purpose of a spam trap.

Speaking of, is there a way to make maildrop forward a mail in the appropriate 
format for spamcop?

Re: Why only 2.3?

Posted by "Scot L. Harris" <we...@cfl.rr.com>.
On Tue, 2004-08-03 at 09:59, Duncan Hill wrote:
> On Tuesday 03 August 2004 14:53, Bob Apthorpe might have typed:
> 
> > On Tue, 3 Aug 2004 14:32:59 +0100 Duncan Hill <sa...@nacnud.force9.co.uk> 
> > > It took about 1/2 an hour after Jeff posted that it was in WS before the
> > > local DNS managed to find a surbl.org DNS that had the record:
> >
> > [...]
> >
> > Note that this is where the delay of greylisting is supposed to help.
> > While a large fraction of spam-emitting hosts won't retry, a delay of
> > 30-60min gives the DNSBLs time to update from trap data. That's the
> > theory anyway.
> 
> Yep, and that's one reason I greylist :)  Not much actually reaches my 
> greylisting db due to other restrictions in place, but it does work 
> effectively.

And using a greylisting delay of only a few minutes currently eliminates
98 to 99% of the spam out there.  Of course in this case a longer delay
would be needed to let the surbl lists update with new spam.  And
if/when the spammers try to get around it the combination of greylisting
and using spamassassin with black lists will take care of the problem.

IMHO if everyone used this combination spam would almost become extinct
and since very few people would be getting the messages I would expect
spam to start going away entirely.  At least until they find another
cheap alternative.
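The greylisting step described above can be sketched as a minimal triplet tracker. This is a hypothetical in-memory model for illustration only, not how any particular MTA policy daemon actually implements it: the first delivery attempt for a new (client IP, sender, recipient) triplet gets a temporary rejection, and a retry after the configured delay is accepted, by which time the SURBLs may have caught up.

```python
import time

class Greylist:
    """Minimal greylisting sketch: temp-reject the first attempt for a
    new (client_ip, sender, recipient) triplet; accept retries that
    arrive after `delay` seconds."""

    def __init__(self, delay=300, clock=time.time):
        self.delay = delay      # seconds a new triplet must wait
        self.clock = clock      # injectable clock, for testing
        self.first_seen = {}    # triplet -> timestamp of first attempt

    def check(self, client_ip, sender, recipient):
        """Return 'defer' (SMTP 4xx temp-fail) or 'accept'."""
        triplet = (client_ip, sender, recipient)
        now = self.clock()
        if triplet not in self.first_seen:
            self.first_seen[triplet] = now
            return "defer"
        if now - self.first_seen[triplet] >= self.delay:
            # Legitimate MTAs retry; most ratware never does.
            return "accept"
        return "defer"
```

A real implementation would also expire old entries and whitelist known-good relays, which is roughly what the "whitelist my spamtraps ahead of all of my protections" remark earlier in the thread is working around.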

-- 
Scot L. Harris
webid@cfl.rr.com

Repartee is something we think of twenty-four hours too late.
		-- Mark Twain 


Re: Why only 2.3?

Posted by Duncan Hill <sa...@nacnud.force9.co.uk>.
On Tuesday 03 August 2004 14:53, Bob Apthorpe might have typed:

> On Tue, 3 Aug 2004 14:32:59 +0100 Duncan Hill <sa...@nacnud.force9.co.uk> 
> > It took about 1/2 an hour after Jeff posted that it was in WS before the
> > local DNS managed to find a surbl.org DNS that had the record:
>
> [...]
>
> Note that this is where the delay of greylisting is supposed to help.
> While a large fraction of spam-emitting hosts won't retry, a delay of
> 30-60min gives the DNSBLs time to update from trap data. That's the
> theory anyway.

Yep, and that's one reason I greylist :)  Not much actually reaches my 
greylisting db due to other restrictions in place, but it does work 
effectively.

Re: Why only 2.3?

Posted by Jeff Chan <je...@surbl.org>.
On Tuesday, August 3, 2004, 6:32:59 AM, Duncan Hill wrote:
> On Tuesday 03 August 2004 13:50, Asif Iqbal might have typed:

>> Yes, I am. And thats what I exactly suspected is that my spamcop_uri is
>> not working. Not sure why. Is there a quick way to check where it is
>> failing ?
>>
>> >   http://www.surbl.org/quickstart.html

> It took about 1/2 an hour after Jeff posted that it was in WS before the local 
> DNS managed to find a surbl.org DNS that had the record:

> 12:56:40 dhill@flyingfish
> ~> dig bigpharmaworld.us.ws.surbl.org
> No answer.

> 13:18:22 dhill@flyingfish
> ~> dig bigpharmaworld.us.ws.surbl.org
> bigpharmaworld.us.ws.surbl.org. 5400 IN A       127.0.0.2

> Feed the same message through SA again and see if WS catches it.

Yes, we're looking at lowering the TTLs across the board
on all SURBLs now.

I'm in a conversation on the rbldnsd list about negative
and positive caching, some effects on name server load,
etc.  It's quite interesting if somewhat esoteric.

Jeff C.
-- 
Jeff Chan
mailto:jeffc@surbl.org
http://www.surbl.org/


Re: Why only 2.3?

Posted by Bob Apthorpe <ap...@cynistar.net>.
Hi,

On Tue, 3 Aug 2004 14:32:59 +0100 Duncan Hill <sa...@nacnud.force9.co.uk> wrote:

> On Tuesday 03 August 2004 13:50, Asif Iqbal might have typed:
> 
> > Yes, I am. And thats what I exactly suspected is that my spamcop_uri is
> > not working. Not sure why. Is there a quick way to check where it is
> > failing ?
> >
> > >   http://www.surbl.org/quickstart.html
> 
> It took about 1/2 an hour after Jeff posted that it was in WS before the local 
> DNS managed to find a surbl.org DNS that had the record:
[...]

Note that this is where the delay of greylisting is supposed to help.
While a large fraction of spam-emitting hosts won't retry, a delay of
30-60min gives the DNSBLs time to update from trap data. That's the
theory anyway.

-- Bob

Re: Why only 2.3?

Posted by Asif Iqbal <iq...@qwestip.net>.
On Tue, Aug 03, 2004 at 02:32:59PM +0100, Duncan Hill wrote:
> On Tuesday 03 August 2004 13:50, Asif Iqbal might have typed:
> 
> > Yes, I am. And thats what I exactly suspected is that my spamcop_uri is
> > not working. Not sure why. Is there a quick way to check where it is
> > failing ?
> >
> > >   http://www.surbl.org/quickstart.html
> 
> It took about 1/2 an hour after Jeff posted that it was in WS before the local 
> DNS managed to find a surbl.org DNS that had the record:

You are right. Now it thinks it is spam

Content analysis details:   (8.4 points, 5.0 required)

 pts rule name              description
---- ----------------------
--------------------------------------------------
 0.1 RATWR10_MESSID         Message-ID has ratware pattern (HEXHEX.HEXHEX@)
 1.7 SARE_RECV_IP_218079    Spam passed through possible spammer relay or source
 0.1 HTML_MESSAGE           BODY: HTML included in message
-0.9 BAYES_10               BODY: Bayesian spam probability is 10 to 20% [score: 0.1115]
 0.3 MIME_HTML_ONLY         BODY: Message only has text/html MIME parts
 0.1 HTML_50_60             BODY: Message is 50% to 60% HTML
 4.0 OB_URI_RBL             URI's domain appears in ob.surbl.org [www.bigpharmaworld.us is blacklisted in URI RBL] [at ob.surbl.org]
 3.0 WS_URI_RBL             URI's domain appears in sa-blacklist [www.bigpharmaworld.us is blacklisted in URI RBL] [at ws.surbl.org]


> 
> 12:56:40 dhill@flyingfish
> ~> dig bigpharmaworld.us.ws.surbl.org
> No answer.
> 
> 13:18:22 dhill@flyingfish
> ~> dig bigpharmaworld.us.ws.surbl.org
> bigpharmaworld.us.ws.surbl.org. 5400 IN A       127.0.0.2
> 
> Feed the same message through SA again and see if WS catches it.

-- 
Asif Iqbal
PGP Key: 0xE62693C5 KeyServer: pgp.mit.edu
There's no place like 127.0.0.1

Re: Why only 2.3?

Posted by Duncan Hill <sa...@nacnud.force9.co.uk>.
On Tuesday 03 August 2004 13:50, Asif Iqbal might have typed:

> Yes, I am. And thats what I exactly suspected is that my spamcop_uri is
> not working. Not sure why. Is there a quick way to check where it is
> failing ?
>
> >   http://www.surbl.org/quickstart.html

It took about 1/2 an hour after Jeff posted that it was in WS before the local 
DNS managed to find a surbl.org DNS that had the record:

12:56:40 dhill@flyingfish
~> dig bigpharmaworld.us.ws.surbl.org
No answer.

13:18:22 dhill@flyingfish
~> dig bigpharmaworld.us.ws.surbl.org
bigpharmaworld.us.ws.surbl.org. 5400 IN A       127.0.0.2

Feed the same message through SA again and see if WS catches it.

Re: Why only 2.3?

Posted by Asif Iqbal <iq...@qwestip.net>.
On Tue, Aug 03, 2004 at 04:59:13AM -0700, Jeff Chan wrote:
> On Tuesday, August 3, 2004, 4:47:35 AM, Asif Iqbal wrote:
> > Hi All
> 
> > For the following spam why am I getting 2.3? This is my report
> 
> > Content analysis details:   (2.3 points, 5.0 required)
> 
> >  pts rule name              description
> > ---- ----------------------
> > --------------------------------------------------
> >  0.1 RATWR10_MESSID         Message-ID has ratware pattern (HEXHEX.HEXHEX@)
> >  1.7 SARE_RECV_IP_218079    Spam passed through possible spammer relay or source
> >  0.1 HTML_MESSAGE           BODY: HTML included in message
> >  0.0 BAYES_50               BODY: Bayesian spam probability is 50 to 56% [score: 0.5004]
> >  0.3 MIME_HTML_ONLY         BODY: Message only has text/html MIME parts
> >  0.1 HTML_50_60             BODY: Message is 50% to 60% HTML
> 
> > I am using SA 2.63 with Rules_Du_Jour and I have spamcop_uri.cf in my
> > /etc/mail/spamassassin dir. I did install it right. Could it be possible
> > that my spamcop_uri is not working?
> 
> bigpharmaworld.us is (only) in the ws.surbl.org list, but are you
> using WS?  The rule and score would look like this:

Yes, I am. And thats what I exactly suspected is that my spamcop_uri is
not working. Not sure why. Is there a quick way to check where it is
failing ?

> 
>   http://www.surbl.org/quickstart.html

I did follow those instructions while installing SURBL support.

I installed the SpamCopURI plugin for SA 2.63 and restarted spamd. I also
have Net::DNS installed.

-- 
Asif Iqbal
PGP Key: 0xE62693C5 KeyServer: pgp.mit.edu
There's no place like 127.0.0.1

Re: Why only 2.3?

Posted by Jeff Chan <je...@surbl.org>.
On Tuesday, August 3, 2004, 4:47:35 AM, Asif Iqbal wrote:
> Hi All

> For the following spam why am I getting 2.3? This is my report

> Content analysis details:   (2.3 points, 5.0 required)

>  pts rule name              description
> ---- ----------------------
> --------------------------------------------------
>  0.1 RATWR10_MESSID         Message-ID has ratware pattern (HEXHEX.HEXHEX@)
>  1.7 SARE_RECV_IP_218079    Spam passed through possible spammer relay or source
>  0.1 HTML_MESSAGE           BODY: HTML included in message
>  0.0 BAYES_50               BODY: Bayesian spam probability is 50 to 56% [score: 0.5004]
>  0.3 MIME_HTML_ONLY         BODY: Message only has text/html MIME parts
>  0.1 HTML_50_60             BODY: Message is 50% to 60% HTML

> I am using SA 2.63 with Rules_Du_Jour and I have spamcop_uri.cf in my
> /etc/mail/spamassassin dir. I did install it right. Could it be possible
> that my spamcop_uri is not working?

bigpharmaworld.us is (only) in the ws.surbl.org list, but are you
using WS?  The rule and score would look like this:

  http://www.surbl.org/quickstart.html

uri       WS_URI_RBL  eval:check_spamcop_uri_rbl('ws.surbl.org','127.0.0.2')
describe  WS_URI_RBL  URI's domain appears in sa-blacklist
tflags    WS_URI_RBL  net

score     WS_URI_RBL  3.0
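
For comparison, the corresponding ob.surbl.org rule suggested by the OB_URI_RBL hit and its 4.0 score elsewhere in this thread would presumably look like the following. This is an assumed reconstruction following the same pattern as the WS rule above, not copied from a shipped config:

```
uri       OB_URI_RBL  eval:check_spamcop_uri_rbl('ob.surbl.org','127.0.0.2')
describe  OB_URI_RBL  URI's domain appears in ob.surbl.org
tflags    OB_URI_RBL  net

score     OB_URI_RBL  4.0
```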

Jeff C.
-- 
Jeff Chan
mailto:jeffc@surbl.org
http://www.surbl.org/


Re: Why only 2.3?

Posted by Jeff Chan <je...@surbl.org>.
On Tuesday, August 3, 2004, 3:00:05 PM, Daniel Quinlan wrote:
> Bret Miller <br...@wcg.org> writes:

>> Actually, report is exactly what he should do. Report to SpamCop. Report
>> by submitting it for ws.surbl.org. Report to SARE if it seems like a
>> rule could be made for it. But not report to this list.

> I meant "report to the users mailing list".  He's posted *10* messages
> about missed spams.  If everyone did this, the list would be completely
> unusable.

Especially when solutions to catch those already exist within
the SA framework. 

FWIW I don't mind hearing about *truly new* forms of spam since
we probably want to be able to catch those.  But these
aren't them....

Jeff C.
-- 
Jeff Chan
mailto:jeffc@surbl.org
http://www.surbl.org/


RE: Why only 2.3?

Posted by Daniel Quinlan <qu...@pathname.com>.
Bret Miller <br...@wcg.org> writes:

> Actually, report is exactly what he should do. Report to SpamCop. Report
> by submitting it for ws.surbl.org. Report to SARE if it seems like a
> rule could be made for it. But not report to this list.

I meant "report to the users mailing list".  He's posted *10* messages
about missed spams.  If everyone did this, the list would be completely
unusable.

------- start of cut text --------------
Subject: This one slipped through
Subject: SPAM undetected
Subject: Another SPAM went undetected
Subject: Spams not detected
Subject: Should be a spam
Subject: Undetectable SPAM
Subject: Here is another SPAM
Subject: [bckoontz@qwestinternet.net: FW: The Power Of The Internet..]
Subject: [syec.support.department@sharingexperiences.us: A user is looking for opinions about: iqbala@qwestip.net]
Subject: Why only 2.3?
------- end ----------------------------

Daniel

-- 
Daniel Quinlan
http://www.pathname.com/~quinlan/

Re: Why only 2.3?

Posted by Steve Bertrand <ia...@ibctech.ca>.
> On Tue, Aug 03, 2004 at 01:10:26PM -0700, Daniel Quinlan wrote:
>> Asif Iqbal <iq...@qwestip.net> writes:
>>
>> > Hi All
>> >
>> > For the following spam why am I getting 2.3? This is my report
>>
>> Dude, are you going to report every single missed spam you get?
>> Missed
>
> My intention was not to report a spam, but rather to see if I could
> get some help detecting spam like that. That is a totally different
> intention from how you are labelling it.
>
> I only expected some help on how to detect spam like that, or on why
> my setup did not detect it, and any suggestions from the mailing list.
>
> Are you saying I am NOT supposed to ask for help by showing a sample
> spam that I failed to detect? Is there another mailing list where I
> can get help with spam that my SA let slip through? If so, I would
> happily redirect emails of that nature to that list.
>
> Hope I made my intention clear.
>
> You replied twice to my email to make your point. No wonder there are
> 920 msgs just from you stored on the internet :-)
>
> Thanks anyway

Wow, relax man!!

I'm no SA expert, nor do I claim to be, but in all honesty I find that
by paying attention to this list as a lurker you can pick up valuable
information.

At the beginning, if I was receiving new spam (new, to me), I would
wander over to http://rulesemporium.com and look through the rulesets
briefly to see if any of them looked like they would catch this _new_
spam.

Over time, I got familiar with what each ruleset did and didn't do, and
by paying close attention to which rules are hitting or missing, you
can disable the individual rules you don't need.

Every day I get at least one spam through that *looks* like it should
hit certain rules, but if it doesn't, don't fret. Just put it in a
Spam folder, and then let sa-learn have a crack at it.

I find that SURBLs take a good majority off the top; after that, it's
tweaking with rulesets. Make sure you check frequently for updated
rulesets that may catch the new one that just *sneaks* by.

I don't think it's anyone's intention to flame you, but the guidelines
are clearly laid out in the Wiki as someone else suggested.

Perhaps if you have this problem in the future, you could post a
little of the troubleshooting you have done yourself, and also note
which rules you _thought_ should have hit; then there would be a more
legitimate reason for posting, e.g.:

[example]
SA did not catch this spam with the subject: [insert subject]. I
assumed that rules x, xy, and z should have hit, but only one of them
did. I found that some URI rules should have hit as well, and they
didn't. I have tested the DNS on my server and it works OK...
[/example]

Regards,

Steve

>
>> spam happens, it's not a bug.  The only way to catch all spam is to
>> send everything, ham and spam, to /dev/null.
>>
>> Please read the FAQ on the wiki:
>>
>>   http://wiki.apache.org/spamassassin/DoYouWantMySpam
>>
>> Daniel
>>
>> --
>> Daniel Quinlan
>> http://www.pathname.com/~quinlan/
>
> --
> Asif Iqbal
> PGP Key: 0xE62693C5 KeyServer: pgp.mit.edu
> There's no place like 127.0.0.1
>



Re: Why only 2.3?

Posted by Asif Iqbal <iq...@qwestip.net>.
On Tue, Aug 03, 2004 at 01:10:26PM -0700, Daniel Quinlan wrote:
> Asif Iqbal <iq...@qwestip.net> writes:
> 
> > Hi All
> > 
> > For the following spam why am I getting 2.3? This is my report
> 
> Dude, are you going to report every single missed spam you get?  Missed

My intention was not to report a spam, but rather to see if I could get
some help detecting spam like that. That is a totally different
intention from how you are labelling it.

I only expected some help on how to detect spam like that, or on why my
setup did not detect it, and any suggestions from the mailing list.

Are you saying I am NOT supposed to ask for help by showing a sample
spam that I failed to detect? Is there another mailing list where I can
get help with spam that my SA let slip through? If so, I would happily
redirect emails of that nature to that list.

Hope I made my intention clear.

You replied twice to my email to make your point. No wonder there are 920
msgs just from you stored on the internet :-)

Thanks anyway

> spam happens, it's not a bug.  The only way to catch all spam is to
> send everything, ham and spam, to /dev/null.
> 
> Please read the FAQ on the wiki:
> 
>   http://wiki.apache.org/spamassassin/DoYouWantMySpam
> 
> Daniel
> 
> -- 
> Daniel Quinlan
> http://www.pathname.com/~quinlan/

-- 
Asif Iqbal
PGP Key: 0xE62693C5 KeyServer: pgp.mit.edu
There's no place like 127.0.0.1

RE: Why only 2.3?

Posted by Bret Miller <br...@wcg.org>.
> > Hi All
> > 
> > For the following spam why am I getting 2.3? This is my report
> 
> Dude, are you going to report every single missed spam you 
> get?  Missed
> spam happens, it's not a bug.  The only way to catch all spam is to
> send everything, ham and spam, to /dev/null.
> 
> Please read the FAQ on the wiki:
> 
>   http://wiki.apache.org/spamassassin/DoYouWantMySpam


Actually, report is exactly what he should do. Report to SpamCop. Report
by submitting it for ws.surbl.org. Report to SARE if it seems like a
rule could be made for it. But not report to this list.

Bret
----------
 
Send your spam to: bretmiller@wcg.org
Thanks for keeping the internet spam-free!
 




Re: Why only 2.3?

Posted by Daniel Quinlan <qu...@pathname.com>.
Asif Iqbal <iq...@qwestip.net> writes:

> Hi All
> 
> For the following spam why am I getting 2.3? This is my report

Dude, are you going to report every single missed spam you get?  Missed
spam happens, it's not a bug.  The only way to catch all spam is to
send everything, ham and spam, to /dev/null.

Please read the FAQ on the wiki:

  http://wiki.apache.org/spamassassin/DoYouWantMySpam

Daniel

-- 
Daniel Quinlan
http://www.pathname.com/~quinlan/