Posted to user@nutch.apache.org by Hugo Alves <hu...@gmail.com> on 2012/08/22 16:29:28 UTC

nutch 2 recrawl question

Hi.

This is a conceptual question; I haven't tried it yet.
The Nutch version I am using is 2.0.

Suppose that a full crawl has been made and the Nutch HSQL database is
filled with all the data.
My understanding is that if I run Nutch again, it should refetch all URLs
according to the FetchSchedule rules.
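To make concrete what I mean by "due for refetch", here is a tiny standalone
sketch of that idea; the class and field names are my own stand-ins, not the
real Nutch FetchSchedule API:

import java.util.concurrent.TimeUnit;

// Sketch only: a URL becomes eligible again once its scheduled fetch time
// has passed. PageRow and its fields are illustrative, not Nutch classes.
public class FetchDueCheck {

  static class PageRow {
    long fetchTime;      // next scheduled fetch time, ms since epoch
    long fetchInterval;  // configured refetch interval, ms
  }

  static boolean isDue(PageRow page, long now) {
    return page.fetchTime <= now;
  }

  public static void main(String[] args) {
    PageRow page = new PageRow();
    page.fetchInterval = TimeUnit.DAYS.toMillis(30);
    // Pretend the page was fetched 31 days ago and rescheduled then.
    page.fetchTime = System.currentTimeMillis()
        - TimeUnit.DAYS.toMillis(31) + page.fetchInterval;
    System.out.println("due for refetch: " + isDue(page, System.currentTimeMillis()));
  }
}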
The process responsible for marking the URLs for fetching is the Generator
mapper, but looking at the code (org.apache.nutch.crawl.GeneratorMapper) I see
this:
if (Mark.GENERATE_MARK.checkMark(page) != null) {
  if (GeneratorJob.LOG.isDebugEnabled()) {
    GeneratorJob.LOG.debug("Skipping " + url + "; already generated");
  }
  return;
}
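If I read this right, a URL only becomes eligible for generation again once
that mark is cleared somewhere later in the cycle. This is the lifecycle I was
expecting, sketched with a plain Map standing in for the page's markers field
(my own illustration, not Nutch code):

import java.util.HashMap;
import java.util.Map;

// Sketch only: generate sets the mark, and some later step has to clear it,
// otherwise the GeneratorMapper check quoted above skips the URL forever.
public class GenerateMarkLifecycle {

  public static void main(String[] args) {
    Map<String, String> markers = new HashMap<>();

    // Generate step: tag the row with the batch id.
    markers.put("_gnmrk_", "1345638110-1053938230");

    // GeneratorMapper-style check on the next run: mark present, so skip.
    System.out.println("skip on recrawl? " + (markers.get("_gnmrk_") != null));

    // What I would expect a later step (fetch/updatedb) to do so the URL
    // can be generated again on the next cycle.
    markers.remove("_gnmrk_");
    System.out.println("skip after mark removed? " + (markers.get("_gnmrk_") != null));
  }
}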

The GENERATE_MARK is always != null, because after the first crawl the
MARKERS field of the database contains:
__prsmrk__*1345638110-1053938230_gnmrk_*1345638110-1053938230_ftcmrk_*1345638110-1053938230
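Pulling that value apart (just a throwaway parse on my side, assuming the dump
is simply mark name + '*' + batch id concatenated), the parse, generate and
fetch marks are all still set with the same batch id:

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class MarkersDump {
  public static void main(String[] args) {
    // The raw MARKERS value as it shows up in my HSQL dump.
    String markers = "__prsmrk__*1345638110-1053938230"
        + "_gnmrk_*1345638110-1053938230"
        + "_ftcmrk_*1345638110-1053938230";

    // Each mark name is followed by a '*' and the batch id it was set with.
    Matcher m = Pattern.compile("(_+[a-z]+_+)\\*([0-9-]+)").matcher(markers);
    while (m.find()) {
      System.out.println(m.group(1) + " -> " + m.group(2));
    }
  }
}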

So, is my assumption correct, or am I missing something?