Posted to dev@nutch.apache.org by "Andrzej Bialecki (JIRA)" <ji...@apache.org> on 2006/07/20 00:06:15 UTC

[jira] Closed: (NUTCH-293) support for Crawl-delay in Robots.txt

     [ http://issues.apache.org/jira/browse/NUTCH-293?page=all ]

Andrzej Bialecki closed NUTCH-293.
-----------------------------------

    Fix Version/s: 0.8-dev
       Resolution: Fixed

Patch applied with minor changes. Thank you!

> support for Crawl-delay in Robots.txt
> -------------------------------------
>
>                 Key: NUTCH-293
>                 URL: http://issues.apache.org/jira/browse/NUTCH-293
>             Project: Nutch
>          Issue Type: Improvement
>          Components: fetcher
>    Affects Versions: 0.8-dev
>            Reporter: Stefan Groschupf
>            Priority: Critical
>             Fix For: 0.8-dev
>
>         Attachments: crawlDelayv1.patch
>
>
> Nutch needs support for the Crawl-delay directive defined in robots.txt; it is not an official standard, but it is a de-facto standard.
> See:
> http://help.yahoo.com/help/us/ysearch/slurp/slurp-03.html
> Webmasters have started blocking Nutch because we do not support it.
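
For reference, Crawl-delay is a per-group directive in robots.txt (e.g. "Crawl-delay: 5" under a "User-agent:" group) giving the number of seconds a crawler should pause between successive requests to that host. What follows is only a minimal standalone sketch of reading that value from raw robots.txt text; the class and method names are invented for illustration, and this is neither the attached crawlDelayv1.patch nor Nutch's actual robots.txt parser:

import java.util.Locale;

/**
 * Minimal sketch: extract the Crawl-delay value, in whole seconds, that
 * applies to a given agent name from raw robots.txt content. Returns -1
 * if no applicable value is found. Fractional delays that some sites use
 * are ignored here for brevity.
 */
public class CrawlDelaySketch {

  public static long getCrawlDelay(String robotsTxt, String agentName) {
    String agent = agentName.toLowerCase(Locale.ROOT);
    boolean inMatchingGroup = false;
    boolean prevWasAgentLine = false;
    long delay = -1;

    for (String line : robotsTxt.split("\r?\n")) {
      // Strip comments and surrounding whitespace.
      int hash = line.indexOf('#');
      if (hash >= 0) line = line.substring(0, hash);
      line = line.trim();
      if (line.isEmpty()) { prevWasAgentLine = false; continue; }

      int colon = line.indexOf(':');
      if (colon < 0) continue;
      String field = line.substring(0, colon).trim().toLowerCase(Locale.ROOT);
      String value = line.substring(colon + 1).trim();

      if (field.equals("user-agent")) {
        boolean matches = value.equals("*")
            || value.toLowerCase(Locale.ROOT).contains(agent);
        // Consecutive User-agent lines belong to one group; otherwise a new group starts.
        inMatchingGroup = prevWasAgentLine ? (inMatchingGroup || matches) : matches;
        prevWasAgentLine = true;
      } else {
        if (field.equals("crawl-delay") && inMatchingGroup) {
          try {
            delay = Long.parseLong(value);
          } catch (NumberFormatException e) {
            // Ignore malformed values and keep whatever was found before.
          }
        }
        prevWasAgentLine = false;
      }
    }
    return delay;
  }

  public static void main(String[] args) {
    String sample =
        "User-agent: *\n" +
        "Disallow: /private/\n" +
        "Crawl-delay: 5\n";
    // Prints 5: the wildcard group asks all crawlers to wait 5 seconds between requests.
    System.out.println(getCrawlDelay(sample, "Nutch"));
  }
}

A fetcher honoring this directive would feed the parsed value into its per-host politeness delay; how the applied patch wires that in is not shown here.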

-- 
This message is automatically generated by JIRA.
-
If you think it was sent incorrectly contact one of the administrators: http://issues.apache.org/jira/secure/Administrators.jspa
-
For more information on JIRA, see: http://www.atlassian.com/software/jira