Posted to dev@nutch.apache.org by "Julien Nioche (JIRA)" <ji...@apache.org> on 2014/06/17 10:00:20 UTC

[jira] [Updated] (NUTCH-1793) HttpRobotRulesParser not configured properly => "http.robots.403.allow" property is not read

     [ https://issues.apache.org/jira/browse/NUTCH-1793?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Julien Nioche updated NUTCH-1793:
---------------------------------

    Attachment: NUTCH-1793.patch

Will commit shortly unless someone has an objection.

> HttpRobotRulesParser not configured properly => "http.robots.403.allow" property is not read
> --------------------------------------------------------------------------------------------
>
>                 Key: NUTCH-1793
>                 URL: https://issues.apache.org/jira/browse/NUTCH-1793
>             Project: Nutch
>          Issue Type: Bug
>          Components: protocol
>    Affects Versions: 1.8
>            Reporter: Julien Nioche
>         Attachments: NUTCH-1793.patch
>
>
> HttpRobotRulesParser is instantiated by HttpBase with its empty constructor, and setConf() is called later; since that call relies on the superclass implementation, the "http.robots.403.allow" property is never read.
> I'm OK with leaving this property in HttpRobotRulesParser, but it should override setConf() so that the property is initialised properly.
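
For reference, a minimal sketch of the kind of setConf() override described above, assuming the property stays in HttpRobotRulesParser. The field name "allowForbidden" and the default of false are illustrative assumptions, not taken from the attached patch:

    import org.apache.hadoop.conf.Configuration;

    public class HttpRobotRulesParser extends RobotRulesParser {

      // Assumed field name; records whether a 403 on robots.txt should be
      // treated as "allow all".
      private boolean allowForbidden = false;

      public HttpRobotRulesParser() {
        // Empty constructor used by HttpBase; configuration arrives later
        // via setConf().
      }

      @Override
      public void setConf(Configuration conf) {
        super.setConf(conf);
        // Read the property here as well, so that instances created with the
        // empty constructor (as HttpBase does) actually pick it up.
        allowForbidden = conf.getBoolean("http.robots.403.allow", false);
      }
    }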



--
This message was sent by Atlassian JIRA
(v6.2#6252)