Posted to dev@nutch.apache.org by "Julien Nioche (JIRA)" <ji...@apache.org> on 2011/07/06 22:35:16 UTC

[jira] [Resolved] (NUTCH-1032) Delegate parsing of robots.txt to crawler-commons

     [ https://issues.apache.org/jira/browse/NUTCH-1032?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Julien Nioche resolved NUTCH-1032.
----------------------------------

    Resolution: Duplicate

Issue created twice for some reason. Duplicate of https://issues.apache.org/jira/browse/NUTCH-1031

> Delegate parsing of robots.txt to crawler-commons
> -------------------------------------------------
>
>                 Key: NUTCH-1032
>                 URL: https://issues.apache.org/jira/browse/NUTCH-1032
>             Project: Nutch
>          Issue Type: Task
>            Reporter: Julien Nioche
>            Assignee: Julien Nioche
>            Priority: Minor
>              Labels: robots.txt
>             Fix For: 1.4, 2.0
>
>
> We're about to release the first version of Crawler-Commons [http://code.google.com/p/crawler-commons/], which contains a parser for robots.txt files. This parser should also be better than the one we currently have in Nutch. I will delegate this functionality to CC as soon as it is publicly available.
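
For reference, the crawler-commons robots.txt parser is driven through SimpleRobotRulesParser and BaseRobotRules. The sketch below shows roughly how a call could look; the sample robots.txt content, the example.com URLs and the "nutch-crawler" agent name are made up for illustration, and exact method signatures may differ in the first public release, so treat this as an illustrative sketch rather than the actual Nutch integration:

import crawlercommons.robots.BaseRobotRules;
import crawlercommons.robots.SimpleRobotRulesParser;

public class RobotsCheck {
    public static void main(String[] args) throws Exception {
        // Raw bytes of a robots.txt file, as the crawler's protocol layer would fetch it.
        // The content here is an invented example.
        byte[] robotsTxt = ("User-agent: *\n"
                + "Disallow: /private/\n"
                + "Crawl-delay: 5\n").getBytes("UTF-8");

        SimpleRobotRulesParser parser = new SimpleRobotRulesParser();

        // Parse the rules on behalf of a given robot name; the URL says where the
        // robots.txt came from, and the content type is usually text/plain.
        BaseRobotRules rules = parser.parseContent(
                "http://example.com/robots.txt",
                robotsTxt,
                "text/plain",
                "nutch-crawler");

        // Ask the parsed rules whether specific URLs may be fetched.
        System.out.println(rules.isAllowed("http://example.com/index.html")); // expected: true
        System.out.println(rules.isAllowed("http://example.com/private/x"));  // expected: false

        // Crawl delay as parsed from the Crawl-delay directive.
        System.out.println("Crawl delay: " + rules.getCrawlDelay());
    }
}

Delegating to an API of this shape would let Nutch drop its own robots.txt parsing code and keep only the fetching and per-host caching of the returned rules.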

--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira