Posted to user@nutch.apache.org by tavery <ta...@itasoftware.com> on 2007/12/03 22:45:43 UTC

Local file system crawl job error

I'm getting the following error when I try to do a local crawl: 

Exception in thread "main" java.io.IOException: Job failed!
        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:604)
        at org.apache.nutch.crawl.Injector.inject(Injector.java:162)
        at org.apache.nutch.crawl.Crawl.main(Crawl.java:115)

I know other folks have run into this. I think some went back to earlier
binaries; do I need to do that? Someone else suggested reconfiguring the
hadoop-site.xml file. I tried a version of the config that worked for him
(roughly the local-filesystem settings sketched below), but it didn't work
for me. Any other suggestions?
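
For reference, here is a rough sketch of the kind of local-mode
hadoop-site.xml I mean. The property names are the standard Hadoop 0.x
ones; the values are placeholders, not my exact settings:

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Sketch of a hadoop-site.xml for crawling on the local filesystem. -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>file:///</value>
    <!-- Use the local filesystem instead of HDFS.
         Some older Hadoop releases used the literal value "local" here. -->
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>local</value>
    <!-- Run map/reduce jobs in-process instead of contacting a JobTracker. -->
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/tmp/hadoop-local</value>
    <!-- Placeholder scratch directory; must exist and be writable. -->
  </property>
</configuration>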

Tyrin