Posted to user@spark.apache.org by Steve Lewis <lo...@gmail.com> on 2014/12/04 17:48:26 UTC

Failed to read chunk exception

I am running a large job using 4000 partitions. After running for four
hours on a 16-node cluster it fails with the message below.
The errors are in Spark code and seem to point to unreliability at the
disk level.
Has anyone seen this, and does anyone know what is going on and how to fix it?


Exception in thread "main" org.apache.spark.SparkException: Job aborted due
to stage failure: Task 13 in stage 15.827 failed 4 times, most recent
failure: Lost task 13.3 in stage 15.827 (TID 13386, pltrd022.labs.uninett.no):
java.io.IOException: failed to read chunk

        org.xerial.snappy.SnappyInputStream.hasNextChunk(SnappyInputStream.java:348)
        org.xerial.snappy.SnappyInputStream.rawRead(SnappyInputStream.java:159)
        org.xerial.snappy.SnappyInputStream.read(SnappyInputStream.java:142)
        .....
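
[Editor's note: the thread does not confirm a fix. One workaround commonly
suggested for snappy "failed to read chunk" shuffle errors on this list is to
switch the block/shuffle compression codec away from Snappy, e.g. to LZF, in
spark-defaults.conf (or via --conf on spark-submit). This is a hedged
suggestion, not a confirmed resolution of the error above.]

```properties
# Hypothetical workaround: compress shuffle/block data with LZF instead of
# Snappy. Valid values for this property in Spark 1.x include "snappy"
# (the default) and "lzf".
spark.io.compression.codec  lzf
```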

Re: Failed to read chunk exception

Posted by rajnish <ra...@gmail.com>.
I am facing the same issue with Spark 1.1.0.

14/12/29 20:44:31 INFO scheduler.TaskSetManager: Starting task 5.0 in stage
1.1 (TID 1373, X.X.X.X, ANY, 2185 bytes)
14/12/29 20:44:31 WARN scheduler.TaskSetManager: Lost task 6.0 in stage 3.0
(TID 1367, X.X.X.X): java.io.IOException: failed to read chunk
       
org.xerial.snappy.SnappyInputStream.hasNextChunk(SnappyInputStream.java:348)
        org.xerial.snappy.SnappyInputStream.read(SnappyInputStream.java:384)
       
java.io.ObjectInputStream$PeekInputStream.peek(ObjectInputStream.java:2293)
       
java.io.ObjectInputStream$BlockDataInputStream.peek(ObjectInputStream.java:2586)




--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Failed-to-read-chunk-exception-tp20374p20891.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org