Posted to dev@spamassassin.apache.org by bu...@bugzilla.spamassassin.org on 2015/04/29 17:40:21 UTC

[Bug 7182] New: SPF records routinely exceed the maximum 10

https://bz.apache.org/SpamAssassin/show_bug.cgi?id=7182

            Bug ID: 7182
           Summary: SPF records routinely exceed the maximum 10
           Product: Spamassassin
           Version: SVN Trunk (Latest Devel Version)
          Hardware: PC
                OS: Windows 7
            Status: NEW
          Severity: normal
          Priority: P2
         Component: Plugins
          Assignee: dev@spamassassin.apache.org
          Reporter: kmcgrail@pccc.com

As discussed in Bug 7112, the SPF RFC (https://tools.ietf.org/html/rfc7208) is
clear that the DNS-querying mechanisms and modifiers in an SPF check, including
nested records, are limited to 10:

   SPF implementations MUST limit the number of mechanisms and modifiers
   that do DNS lookups to at most 10 per SPF check, including any
   lookups caused by the use of the "include" mechanism or the
   "redirect" modifier.  If this number is exceeded during a check, a
   PermError MUST be returned.  The "include", "a", "mx", "ptr", and
   "exists" mechanisms as well as the "redirect" modifier do count
   against this limit.  The "all", "ip4", and "ip6" mechanisms do not
   require DNS lookups and therefore do not count against this limit.
   The "exp" modifier does not count against this limit because the DNS
   lookup to fetch the explanation string occurs after the SPF record
   has been evaluated.

This limit is also a real-world DoS-prevention technique, and ignoring it is wrong.

Large players like Google handle this fine (though they aren't perfect... keep
reading):

dig -t txt _spf.google.com     
_spf.google.com.        199     IN      TXT     "v=spf1
include:_netblocks.google.com include:_netblocks2.google.com
include:_netblocks3.google.com ~all"

dig -t txt _netblocks.google.com   
_netblocks.google.com.  2891    IN      TXT     "v=spf1 ip4:64.18.0.0/20
ip4:64.233.160.0/19 ip4:66.102.0.0/20 ip4:66.249.80.0/20 ip4:72.14.192.0/18
ip4:74.125.0.0/16 ip4:173.194.0.0/16 ip4:207.126.144.0/20 ip4:209.85.128.0/17
ip4:216.58.192.0/19 ip4:216.239.32.0/19 ~all"

dig -t txt _netblocks2.google.com     
_netblocks2.google.com. 3251    IN      TXT     "v=spf1 ip6:2001:4860:4000::/36
ip6:2404:6800:4000::/36 ip6:2607:f8b0:4000::/36 ip6:2800:3f0:4000::/36
ip6:2a00:1450:4000::/36 ip6:2c0f:fb50:4000::/36 ~all"

dig -t txt _netblocks3.google.com 
_netblocks3.google.com. 3249    IN      TXT     "v=spf1 ~all"

Overall, that looks good and stays well under the limit of 10.
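
To make the counting concrete, here is a minimal standalone sketch (not
SpamAssassin code) of how RFC 7208 section 4.6.4 tallies DNS-querying terms.
It assumes the Net::DNS CPAN module; the spf_record and count_lookups helpers
are illustrative names, it ignores SPF macros, and it does not follow
redirect= targets:

  use strict;
  use warnings;
  use Net::DNS;

  my $res = Net::DNS::Resolver->new;

  # Fetch the first TXT record for $domain that starts with "v=spf1".
  sub spf_record {
      my ($domain) = @_;
      my $reply = $res->query($domain, 'TXT') or return;
      for my $rr (grep { $_->type eq 'TXT' } $reply->answer) {
          my $txt = join '', $rr->txtdata;  # TXT data may span several strings
          return $txt if $txt =~ /^v=spf1(\s|$)/;
      }
      return;
  }

  # Count the terms that cost a DNS lookup, recursing into include: chains.
  sub count_lookups {
      my ($domain, $depth) = @_;
      return 0 if $depth > 10;              # crude guard against include loops
      my $spf = spf_record($domain) or return 0;
      my $count = 0;
      for my $term (split ' ', $spf) {
          if ($term =~ /^[+\-~?]?include:(\S+)/) {
              $count += 1 + count_lookups($1, $depth + 1);
          }
          elsif ($term =~ m{^[+\-~?]?(?:a|mx|ptr)(?:[:/]|$)}
              || $term =~ /^[+\-~?]?exists:/
              || $term =~ /^redirect=/) {
              $count++;  # one lookup each; redirect targets are not followed
          }
          # "all", "ip4:" and "ip6:" cost nothing, and "exp=" is exempt per the RFC
      }
      return $count;
  }

  printf "%s: %d DNS-querying terms\n", $_, count_lookups($_, 0) for @ARGV;

Pointed at _spf.google.com, this counts the three _netblocks includes shown
above; pointed at ebay.com, it walks the 13+ lookups listed below.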


Others, like eBay, need at least 13 lookups, which clearly breaks the RFC.
From tests on 4/29/2015:

dig -t txt ebay.com
dig -t txt s._spf.ebay.com
dig -t txt c._spf.ebay.com
dig -t txt p._spf.ebay.com
dig -t txt emarsys.net
dig -t txt _spf.salesforce.com
dig -t txt _mtablock1.salesforce.com
dig -t txt p2._spf.ebay.com
dig -t txt docusign.net
dig -t txt sendgrid.net
dig -t txt cmail1.com
dig -t txt sendgrid.biz
dig -t txt pp._spf.paypal.com


And with places like SecureServer/GoDaddy, I also got to 13 and stopped
counting manually:

dig -t txt smtp.secureserver.net
smtp.secureserver.net.  79508   IN      TXT     "v=spf1
include:spf.secureserver.net -all"
dig -t txt spf.secureserver.net       
spf.secureserver.net.   373     IN      TXT     "v=spf1
include:spf100.secureserver.net include:spf200.secureserver.net -all"
dig -t txt spf100.secureserver.net
spf100.secureserver.net. 368    IN      TXT     "v=spf1
include:spf101.secureserver.net include:spf102.secureserver.net
include:spf103.secureserver.net include:spf104.secureserver.net
include:spf105.secureserver.net include:spf106.secureserver.net
include:spf107.secureserver.net -all"
dig -t txt spf200.secureserver.net   
spf200.secureserver.net. 361    IN      TXT     "v=spf1
include:spf201.secureserver.net include:spf202.secureserver.net
include:spf203.secureserver.net -all"

This is ridiculous and is causing real-world PermErrors in SPF.

SpamAssassin already raised its limit to max_dns_interactive_terms => 15
(https://svn.apache.org/viewvc?view=rev&rev=1646363), and we are already
seeing that limit exceeded by places using GoDaddy/SecureServer.
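
For reference, the knob being raised here is Mail::SPF's documented
max_dns_interactive_terms option, which SpamAssassin's SPF plugin hands to
Mail::SPF::Server.  A minimal standalone sketch of that option in use (the
identity and IP are illustrative placeholders, not the plugin's actual code):

  use strict;
  use warnings;
  use Mail::SPF;

  my $spf_server = Mail::SPF::Server->new(
      max_dns_interactive_terms => 15,  # Mail::SPF defaults to the RFC's 10
  );

  my $request = Mail::SPF::Request->new(
      scope      => 'mfrom',
      identity   => 'user@example.com',  # placeholder envelope sender
      ip_address => '192.0.2.1',         # placeholder connecting IP
  );

  my $result = $spf_server->process($request);
  print $result->code, "\n";  # pass, fail, softfail, permerror, ...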


eBay and GoDaddy should be on top of these things.  I don't have the time to
explain RFCs or shame major companies into following them.  

Additionally, from testing, Gmail clearly doesn't enforce the limit of 10 and
accepts records needing at least 16 lookups.  That means they clearly see the
same practical, real-world issue and have raised their limit, making
themselves vulnerable to being used as middlemen in a DoS attack, as discussed
in the RFC.

Therefore, I am raising the limit from 15 to 20 for trunk and the 3.4 branch.
Additionally, we might want to make this a configurable option and LOWER the
default to 10 to match the RFC, with the recommendation that in real-world
operations 16 or even 20 might be necessary.

Will add commits when testing completes.

regards,
KAM

[Bug 7182] SPF records routinely exceed the maximum 10

John Hardin <jh...@impsec.org> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |jhardin@impsec.org

--- Comment #1 from John Hardin <jh...@impsec.org> ---
Would per-domain override of the default be possible? I think having exceptions
for bad configs like eBay and SecureServer would be better than opening it up
to potential DoS via *any* domain.

[Bug 7182] SPF records routinely exceed the maximum 10

--- Comment #6 from John Hardin <jh...@impsec.org> ---
...is this still somehow a blocker for 3.4.2? 

I don't think wanting per-domain configuration of this (given the difficulty of
determining which domains require deeper recursion to work properly) should be
considered a blocker.  I don't know that per-domain configuration is even
needed; I was just throwing out an idea.

And the suggestion to make the SPF recursion exposed for scoring, while
potentially useful, probably shouldn't be considered a blocker either.

[Bug 7182] SPF records routinely exceed the maximum 10

Dave Jones <da...@apache.org> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |davej@apache.org

--- Comment #8 from Dave Jones <da...@apache.org> ---
+1

I have many customers held hostage by third-party SPF records whose
maintainers don't understand that they should be conservative with the DNS
lookups their SPF records require.

For example, we have customers that were told to include:spf.pearson.com in
their SPF record so emails could be sent on their behalf.  Yet if you look at
the spf.pearson.com SPF record, it includes _spf.google.com and takes up 6 of
the 10 allowed queries.  I opened a case with Pearson, but after months of
trying to make them understand, they just don't get it, so I gave up.

Limiting to 20 lookups is fine to match the real world.  Close this bug.

[Bug 7182] SPF records routinely exceed the maximum 10

Joe Quinn <jq...@pccc.com> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |jquinn+SAbug@pccc.com

--- Comment #5 from Joe Quinn <jq...@pccc.com> ---
As a gentle incentive for writing RFC-compliant SPF records, I suggest we
write a rule that detects breaking the depth limit.

It should start at 0.001 and increase by perhaps 0.1 per month for two years.
We can start taking a harder stance on this without overly hurting S/O, and
those who can't behave within two years will carry a 2.4-point penalty
(0.1 x 24 months = 2.4).

Normally we're not supposed to be RFC enforcers, but there's a degree of
internet neighborliness that I think applies to us, and KAM agrees.
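
A hypothetical sketch of what such a rule could look like in SpamAssassin's
rule syntax, leaning on the SPF plugin's existing check_for_spf_permerror
eval (the rule name is a placeholder, the starting score is the one proposed
above, and note a PermError can also come from plain syntax errors):

  header   SPF_LOOKUP_LIMIT  eval:check_for_spf_permerror()
  describe SPF_LOOKUP_LIMIT  SPF check hit PermError, often the DNS lookup limit
  score    SPF_LOOKUP_LIMIT  0.001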

[Bug 7182] SPF records routinely exceed the maximum 10

Kevin A. McGrail <km...@apache.org> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
         Resolution|---                         |FIXED
             Status|NEW                         |RESOLVED

--- Comment #9 from Kevin A. McGrail <km...@apache.org> ---
Closing at 20.

[Bug 7182] SPF records routinely exceed the maximum 10

Kevin A. McGrail <km...@pccc.com> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
   Target Milestone|Undefined                   |3.4.2
           Severity|normal                      |blocker

--- Comment #4 from Kevin A. McGrail <km...@pccc.com> ---
Upping to a blocker for 3.4.2.

[Bug 7182] SPF records routinely exceed the maximum 10

Kevin A. McGrail <km...@pccc.com> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |kmcgrail@pccc.com

--- Comment #3 from Kevin A. McGrail <km...@pccc.com> ---
(In reply to John Hardin from comment #1)
> Would per-domain override of the default be possible? I think having
> exceptions for bad configs like eBay and SecureServer would be better than
> opening it up to potential DoS via *any* domain.

Well, in a quick check, we had over 300 PermError hits in the past 5 hours.
Without parsing further, I predict new issues every day.

For example, we found the SecureServer one because we had a subdomain that
used a redirect/include that chained to a domain with an include for
SecureServer, and so on.  That one extra bump on the subdomain put us over 15.

So it's not the easiest issue to find, though Joe's work to unearth
T_SPF_PERMERROR helps, and there is an open bug for more SPF info arising from
that issue.

So I like your idea, but I think it would be a nightmare to use, so we would
likely stick with a higher limit, in line with other major players like Gmail
that ignore the RFC.  Hopefully, doubling from 10 to 20 doesn't dramatically
increase the DoS potential, given the 10+ years since the spec was started.

In conclusion, a single configurable number is my choice.  But hey, if you
code up something that works per domain, I might think of a way to add
multiple thresholds or something.

Regards,
KAM

[Bug 7182] SPF records routinely exceed the maximum 10

--- Comment #2 from Kevin A. McGrail <km...@pccc.com> ---
svn commit -m 'Trunk Commit of Bug 7182 for raise from 15 to 20'
Sending        lib/Mail/SpamAssassin/Plugin/SPF.pm
Transmitting file data .
Committed revision 1676785.

svn commit -m '3.4 branch Commit of Bug 7182 for raise from 15 to 20'
Sending        lib/Mail/SpamAssassin/Plugin/SPF.pm
Transmitting file data .
Committed revision 1676787.

[Bug 7182] SPF records routinely exceed the maximum 10

Bill Cole <sa...@billmail.scconsult.com> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |sa-bugz-20080315@billmail.scconsult.com

--- Comment #7 from Bill Cole <sa...@billmail.scconsult.com> ---
(In reply to John Hardin from comment #6)
> ...is this still somehow a blocker for 3.4.2? 
> 
> I don't think wanting per-domain configuration of this (given the difficulty
> of determining which domains require deeper recursion to work properly)
> should be considered a blocker. I don't know that per-domain configuration
> is even needed, I was just throwing out an idea.
> 
> And the suggestion to make the SPF recursion exposed for scoring, while
> potentially useful, probably shouldn't be considered a blocker either.

+1

Remaining issues seem like substantive new functionality, not bugs.
