Posted to dev@nutch.apache.org by "Sami Siren (JIRA)" <ji...@apache.org> on 2006/12/21 06:10:22 UTC

[jira] Updated: (NUTCH-272) Max. pages to crawl/fetch per site (emergency limit)

     [ http://issues.apache.org/jira/browse/NUTCH-272?page=all ]

Sami Siren updated NUTCH-272:
-----------------------------


>Thanks Doug, that makes more sense now. Running URLFilters.filter() during Generate seems very handy,
>albeit costly for large crawls. (Should have an option to turn off?) 

URL filtering inside the Generator has been made optional in NUTCH-403.
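For reference, this is roughly the shape such an optional check takes in the Generator's map step. This is a sketch, not the committed patch; the "generate.filter" property name below is an assumption for illustration, so check nutch-default.xml for the key NUTCH-403 actually introduced:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.nutch.net.URLFilters;

    // Sketch of the optional-filtering pattern: skip the URLFilters pass
    // entirely when the (assumed) config switch turns it off.
    public class GenerateFilterSketch {
      public static String maybeFilter(Configuration conf, URLFilters filters,
          String url) throws Exception {
        // Assumed property name; filtering stays on by default.
        boolean doFilter = conf.getBoolean("generate.filter", true);
        if (!doFilter) {
          return url;               // filtering disabled: pass everything through
        }
        return filters.filter(url); // null means some filter rejected the URL
      }
    }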

> Max. pages to crawl/fetch per site (emergency limit)
> ----------------------------------------------------
>
>                 Key: NUTCH-272
>                 URL: http://issues.apache.org/jira/browse/NUTCH-272
>             Project: Nutch
>          Issue Type: Improvement
>            Reporter: Stefan Neufeind
>
> If I'm right, there is currently no mechanism for setting an "emergency limit" on the maximum number of pages to fetch per site. Is there an "easy" way to implement such a limit, maybe as a plugin?
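
A per-site cap can at least be sketched as a URLFilter plugin. The class name and the "urlfilter.maxperhost" property below are made up for illustration, and note the big caveat: filter instances live in task-local memory, so in a distributed crawl each map task counts hosts separately and the cap is only approximate, not a global guarantee.

    import java.net.URL;
    import java.util.HashMap;
    import java.util.Map;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.nutch.net.URLFilter;

    // Illustrative "emergency limit" filter: reject URLs from a host once a
    // task-local counter passes a configurable threshold.
    public class MaxPagesPerHostFilter implements URLFilter {

      private Configuration conf;
      private int maxPerHost = 1000; // default cap; overridden via setConf()
      private final Map<String, Integer> counts = new HashMap<String, Integer>();

      public String filter(String urlString) {
        String host;
        try {
          host = new URL(urlString).getHost().toLowerCase();
        } catch (Exception e) {
          return null; // unparsable URL: reject it
        }
        Integer seen = counts.get(host);
        int n = (seen == null) ? 0 : seen.intValue();
        if (n >= maxPerHost) {
          return null; // over the limit for this host: drop the URL
        }
        counts.put(host, n + 1);
        return urlString; // under the limit: let the URL through
      }

      public void setConf(Configuration conf) {
        this.conf = conf;
        // Assumed property name, for illustration only.
        maxPerHost = conf.getInt("urlfilter.maxperhost", maxPerHost);
      }

      public Configuration getConf() {
        return conf;
      }
    }

Like any URLFilter, it would also need the usual plugin.xml descriptor and an entry in plugin.includes before Nutch loads it.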
