Posted to dev@nutch.apache.org by "Doğacan Güney (JIRA)" <ji...@apache.org> on 2007/02/15 16:46:05 UTC
[jira] Created: (NUTCH-446) RobotRulesParser should ignore Crawl-delay values of other bots in robots.txt
RobotRulesParser should ignore Crawl-delay values of other bots in robots.txt
-----------------------------------------------------------------------------
Key: NUTCH-446
URL: https://issues.apache.org/jira/browse/NUTCH-446
Project: Nutch
Issue Type: Bug
Components: fetcher
Affects Versions: 0.9.0
Reporter: Doğacan Güney
Priority: Minor
Fix For: 0.9.0
Attachments: crawl-delay.patch
RobotRulesParser doesn't check addRules when reading the crawl-delay value, so the Nutch bot will pick up another robot's Crawl-delay value from robots.txt.
Let me try to be more clear:
User-agent: foobot
Crawl-delay: 3600
User-agent: *
Disallow: /baz
With a robots.txt file like this, the Nutch bot will get 3600 as its crawl-delay value, no matter what the Nutch bot's name actually is.
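For illustration, the fixed behavior can be sketched as follows. This is not Nutch's actual RobotRulesParser; the class name, the crawlDelayFor helper, and the substring-based agent matching are all hypothetical. The point is only that a Crawl-delay line should be honored solely while inside a User-agent section that applies to our own agent (tracked here by the addRules flag the report mentions):

```java
import java.util.Locale;

// Hypothetical sketch of the intended Crawl-delay handling, not Nutch code.
public class CrawlDelaySketch {

    /** Returns the Crawl-delay (seconds) that applies to agentName, or -1 if none. */
    static long crawlDelayFor(String robotsTxt, String agentName) {
        boolean addRules = false; // are we inside a section that applies to us?
        long delay = -1;
        for (String line : robotsTxt.split("\n")) {
            String l = line.trim().toLowerCase(Locale.ROOT);
            if (l.startsWith("user-agent:")) {
                String agent = l.substring("user-agent:".length()).trim();
                addRules = agent.equals("*")
                        || agentName.toLowerCase(Locale.ROOT).contains(agent);
            } else if (l.startsWith("crawl-delay:") && addRules) {
                // The reported bug: this value was read without checking addRules,
                // so any robot's Crawl-delay leaked into our own settings.
                delay = Long.parseLong(l.substring("crawl-delay:".length()).trim());
            }
        }
        return delay;
    }

    public static void main(String[] args) {
        String robots = "User-agent: foobot\n"
                      + "Crawl-delay: 3600\n"
                      + "\n"
                      + "User-agent: *\n"
                      + "Disallow: /baz\n";
        // nutchbot is not foobot, so foobot's 3600-second delay must not apply.
        System.out.println(crawlDelayFor(robots, "nutchbot")); // prints -1
        System.out.println(crawlDelayFor(robots, "foobot"));   // prints 3600
    }
}
```

With the addRules check in place, the example robots.txt above yields no crawl delay for a Nutch bot, while foobot still sees 3600.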
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
[jira] Assigned: (NUTCH-446) RobotRulesParser should ignore Crawl-delay values of other bots in robots.txt
Posted by "Sami Siren (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/NUTCH-446?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sami Siren reassigned NUTCH-446:
--------------------------------
Assignee: Sami Siren
[jira] Commented: (NUTCH-446) RobotRulesParser should ignore Crawl-delay values of other bots in robots.txt
Posted by "Doğacan Güney (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/NUTCH-446?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#action_12494734 ]
Doğacan Güney commented on NUTCH-446:
-------------------------------------
So, does anyone have objections to this? It fixes an annoying (albeit rare) bug in which Nutch doesn't fetch pages even though it is allowed to, or behaves too politely or impolitely. And it doesn't seem to break anything.
[jira] Resolved: (NUTCH-446) RobotRulesParser should ignore Crawl-delay values of other bots in robots.txt
Posted by "Sami Siren (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/NUTCH-446?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sami Siren resolved NUTCH-446.
------------------------------
Resolution: Fixed
I just committed this. Keep the patches coming, Doğacan!
[jira] Commented: (NUTCH-446) RobotRulesParser should ignore Crawl-delay values of other bots in robots.txt
Posted by "Sami Siren (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/NUTCH-446?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#action_12492850 ]
Sami Siren commented on NUTCH-446:
----------------------------------
+1
[jira] Updated: (NUTCH-446) RobotRulesParser should ignore Crawl-delay values of other bots in robots.txt
Posted by "Doğacan Güney (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/NUTCH-446?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Doğacan Güney updated NUTCH-446:
--------------------------------
Attachment: crawl-delay_test.patch
Test case for crawl-delay rules. Nutch fails this test without the crawl-delay patch.
[jira] Updated: (NUTCH-446) RobotRulesParser should ignore Crawl-delay values of other bots in robots.txt
Posted by "Doğacan Güney (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/NUTCH-446?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Doğacan Güney updated NUTCH-446:
--------------------------------
Attachment: crawl-delay.patch