Posted to user@nutch.apache.org by Ab...@aol.com on 2007/03/27 03:38:54 UTC

log4j:ERROR Failed to flush writer,

Hi
 
I am using Nutch 0.8.1 with Hadoop, along with the recrawl script found at:
http://wiki.apache.org/nutch/IntranetRecrawl#head-e58e25a0b9530bb6fcdfb282fd27a207fc0aff0
 
I am specifying a depth of 12 when submitting the recrawl script.
Here is how I invoke it:
nohup ./crawl.sh /user/nutch/crawl 12 > out &
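As an aside on the invocation: only stdout is redirected above, so a variant (my assumption, not from the original script) is to add 2>&1 so that anything written to stderr also lands in the out file. A small self-contained sketch of that redirection:

```shell
# Hypothetical sketch (not from the original post): with 2>&1, stderr is
# merged into the same file as stdout, e.g.
#   nohup ./crawl.sh /user/nutch/crawl 12 > out 2>&1 &
# A minimal demonstration of the redirection itself:
sh -c 'echo normal output; echo error output 1>&2' > out.txt 2>&1
cat out.txt   # both lines end up in out.txt
```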
 
The Crawl/Generate/Fetch cycle progresses fine until depth 3. When it is
doing depth 4, I see the log4j error below, written to the out file.
 
The crawl does not die. Thanks for any feedback you can offer.
 
Content of the out file
------------------------
 
Generator: starting
Generator: segment: /user/nutch/crawl/segments/20070326152932
Generator: Selecting best-scoring urls due for fetch.
Generator: Partitioning selected urls by host, for politeness.
Generator: done.
2007-03-26:16:37:23: Fetcher Starting\n
Fetcher: starting
Fetcher: segment: /user/nutch/crawl/segments/20070326152932
Fetcher: done
2007-03-26:18:00:30: Update DB Starting\n
CrawlDb update: starting
log4j:ERROR Failed to flush writer,
java.io.IOException: Operation not permitted
        at java.io.FileOutputStream.writeBytes(Native Method)
        at java.io.FileOutputStream.write(FileOutputStream.java:260)
        at sun.nio.cs.StreamEncoder$CharsetSE.writeBytes(StreamEncoder.java:336)
        at sun.nio.cs.StreamEncoder$CharsetSE.implFlushBuffer(StreamEncoder.java:404)
        at sun.nio.cs.StreamEncoder$CharsetSE.implFlush(StreamEncoder.java:408)
        at sun.nio.cs.StreamEncoder.flush(StreamEncoder.java:152)
        at java.io.OutputStreamWriter.flush(OutputStreamWriter.java:213)
        at org.apache.log4j.helpers.QuietWriter.flush(QuietWriter.java:57)
        at org.apache.log4j.WriterAppender.subAppend(WriterAppender.java:315)
        at org.apache.log4j.DailyRollingFileAppender.subAppend(DailyRollingFileAppender.java:358)
        at org.apache.log4j.WriterAppender.append(WriterAppender.java:159)
        at org.apache.log4j.AppenderSkeleton.doAppend(AppenderSkeleton.java:230)
        at org.apache.log4j.helpers.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:65)
        at org.apache.log4j.Category.callAppenders(Category.java:203)
        at org.apache.log4j.Category.forcedLog(Category.java:388)
        at org.apache.log4j.Category.log(Category.java:853)
        at org.apache.commons.logging.impl.Log4JLogger.info(Log4JLogger.java:133)
        at org.apache.nutch.crawl.CrawlDb.update(CrawlDb.java:50)
        at org.apache.nutch.crawl.CrawlDb.main(CrawlDb.java:116)
CrawlDb update: db: /user/nutch/crawl/crawldb
CrawlDb update: segment: /user/nutch/crawl/segments/20070326152932
CrawlDb update: Merging segment data into db.
CrawlDb update: done




