Posted to dev@nutch.apache.org by "Markus Jelsma (JIRA)" <ji...@apache.org> on 2015/12/10 17:01:11 UTC
[jira] [Closed] (NUTCH-1995) Add support for wildcard to http.robot.rules.whitelist
[ https://issues.apache.org/jira/browse/NUTCH-1995?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Markus Jelsma closed NUTCH-1995.
--------------------------------
Resolution: Fixed
Closing again. It seems an older Nutch jar was lying around; the plugin called a method that didn't exist. Sorry for the fuss!
> Add support for wildcard to http.robot.rules.whitelist
> ------------------------------------------------------
>
> Key: NUTCH-1995
> URL: https://issues.apache.org/jira/browse/NUTCH-1995
> Project: Nutch
> Issue Type: Improvement
> Components: robots
> Affects Versions: 1.10
> Reporter: Giuseppe Totaro
> Assignee: Giuseppe Totaro
> Labels: memex
> Fix For: 1.11
>
> Attachments: NUTCH-1995.MattmannNagelTotaro.05-26-2015.patch, NUTCH-1995.MattmannNagelTotaro.05-27-2015.patch, NUTCH-1995.MattmannNagelTotaro.patch, NUTCH-1995.patch
>
>
> The {{http.robot.rules.whitelist}} ([NUTCH-1927|https://issues.apache.org/jira/browse/NUTCH-1927]) configuration parameter allows specifying a comma-separated list of hostnames or IP addresses for which robots.txt parsing is ignored.
> Adding wildcard support to {{http.robot.rules.whitelist}} could be very useful and would simplify configuration when many hostnames/addresses need to be whitelisted. Here is an example:
> {noformat}
> <property>
> <name>http.robot.rules.whitelist</name>
> <value>*.sample.com</value>
> <description>Comma separated list of hostnames or IP addresses to ignore
> robot rules parsing for. Use with care and only if you are explicitly
> allowed by the site owner to ignore the site's robots.txt!
> </description>
> </property>
> {noformat}
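> To illustrate the idea, here is a minimal sketch of how such a wildcard whitelist check might work. The class name {{WhitelistMatcher}} and its methods are hypothetical and for illustration only, not Nutch's actual implementation; in this sketch a leading {{*.}} entry matches the bare domain as well as any subdomain, which is a design choice rather than specified behavior.
> {noformat}
> import java.util.Arrays;
> import java.util.List;
>
> public class WhitelistMatcher {
>   private final List<String> patterns;
>
>   // commaSeparated: the http.robot.rules.whitelist property value
>   public WhitelistMatcher(String commaSeparated) {
>     this.patterns = Arrays.asList(commaSeparated.trim().split("\\s*,\\s*"));
>   }
>
>   // Returns true if the host matches any whitelist entry.
>   // "*.sample.com" matches "sample.com" and any subdomain of it.
>   public boolean isWhitelisted(String host) {
>     for (String p : patterns) {
>       if (p.startsWith("*.")) {
>         String suffix = p.substring(1);  // ".sample.com"
>         String domain = p.substring(2);  // "sample.com"
>         if (host.equals(domain) || host.endsWith(suffix)) {
>           return true;
>         }
>       } else if (host.equals(p)) {
>         return true;
>       }
>     }
>     return false;
>   }
> }
> {noformat}
> With {{*.sample.com,10.0.0.1}} configured, {{www.sample.com}} and {{10.0.0.1}} would be whitelisted, while {{evil-sample.com}} would not, since it only shares a suffix of the literal text, not a dot-delimited subdomain boundary.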
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)