Posted to user@nutch.apache.org by Mick Peters <mi...@gmail.com> on 2009/06/01 10:30:35 UTC

hadoop.log in parallel crawling

Hi dudes, I'm crawling several sites in parallel, and if I crawl with the
default nutch_conf I have problems with all the crawls writing to the same hadoop.log.
So now I create a nutch_conf_dir per site and set a new hadoop.log.dir per site
in log4j.properties, but the problem is that when I run a crawl after exporting the
new NUTCH_CONF_DIR, Nutch still writes to the log from the default NUTCH_CONF.
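
For reference, here is roughly what my per-site setup looks like (the paths, site
names and the exact appender property below are just examples, not my real config):

    # conf_site1/log4j.properties -- the only line I change per site
    log4j.appender.DRFA.File=/opt/nutch/logs/site1/hadoop.log

    # then, before starting the crawl for that site
    export NUTCH_CONF_DIR=/opt/nutch/conf_site1
    bin/nutch crawl urls/site1 -dir crawl/site1 -depth 3 -topN 1000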
Thanks in advance!