Posted to issues@spark.apache.org by "Anuja Jakhade (JIRA)" <ji...@apache.org> on 2019/01/31 11:07:00 UTC

[jira] [Created] (SPARK-26796) Testcases failing with "org.apache.hadoop.fs.ChecksumException" error

Anuja Jakhade created SPARK-26796:
-------------------------------------

             Summary: Testcases failing with "org.apache.hadoop.fs.ChecksumException" error
                 Key: SPARK-26796
                 URL: https://issues.apache.org/jira/browse/SPARK-26796
             Project: Spark
          Issue Type: Bug
          Components: Tests
    Affects Versions: 2.4.0, 2.3.2
         Environment: I am working on Ubuntu 16.04 on s390x.

Java Version

openjdk version "1.8.0_191"
OpenJDK Runtime Environment (build 1.8.0_191-8u191-b12-0ubuntu0.16.04.1-b12)
OpenJDK 64-Bit Zero VM (build 25.191-b12, interpreted mode)

 

Hadoop  Version

Hadoop 2.7.1
Subversion Unknown -r Unknown
Compiled by test on 2019-01-29T09:09Z
Compiled with protoc 2.5.0
From source with checksum 5e94a235f9a71834e2eb73fb36ee873f
This command was run using /home/test/hadoop-release-2.7.1/hadoop-dist/target/hadoop-2.7.1/share/hadoop/common/hadoop-common-2.7.1.jar
            Reporter: Anuja Jakhade


Observing test case failures due to a checksum error (org.apache.hadoop.fs.ChecksumException) while running the Spark test suite on s390x.

Below is the error log:

[ERROR] checkpointAndComputation(test.org.apache.spark.JavaAPISuite) Time elapsed: 1.232 s <<< ERROR!
org.apache.spark.SparkException: 
Job aborted due to stage failure: Task 0 in stage 2.0 failed 1 times, most recent failure: Lost task 0.0 in stage 2.0 (TID 2, localhost, executor driver): org.apache.hadoop.fs.ChecksumException: Checksum error: file:/home/test/spark/core/target/tmp/1548319689411-0/fd0ba388-539c-49aa-bf76-e7d50aa2d1fc/rdd-0/part-00000 at 0 exp: 222499834 got: 1400184476
 at org.apache.hadoop.fs.FSInputChecker.verifySums(FSInputChecker.java:323)
 at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:279)
 at org.apache.hadoop.fs.FSInputChecker.fill(FSInputChecker.java:214)
 at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:232)
 at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:196)
 at java.io.DataInputStream.read(DataInputStream.java:149)
 at java.io.ObjectInputStream$PeekInputStream.read(ObjectInputStream.java:2769)
 at java.io.ObjectInputStream$PeekInputStream.readFully(ObjectInputStream.java:2785)
 at java.io.ObjectInputStream$BlockDataInputStream.readShort(ObjectInputStream.java:3262)
 at java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:968)
 at java.io.ObjectInputStream.<init>(ObjectInputStream.java:390)
 at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.<init>(JavaSerializer.scala:63)
 at org.apache.spark.serializer.JavaDeserializationStream.<init>(JavaSerializer.scala:63)
 at org.apache.spark.serializer.JavaSerializerInstance.deserializeStream(JavaSerializer.scala:122)
 at org.apache.spark.rdd.ReliableCheckpointRDD$.readCheckpointFile(ReliableCheckpointRDD.scala:300)
 at org.apache.spark.rdd.ReliableCheckpointRDD.compute(ReliableCheckpointRDD.scala:100)
 at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
 at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
 at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:322)
 at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
 at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
 at org.apache.spark.scheduler.Task.run(Task.scala:109)
 at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
 at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
 at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
 at java.lang.Thread.run(Thread.java:813)

Driver stacktrace:
 at test.org.apache.spark.JavaAPISuite.checkpointAndComputation(JavaAPISuite.java:1243)
Caused by: org.apache.hadoop.fs.ChecksumException: Checksum error:
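For context, the `exp: 222499834 got: 1400184476` mismatch in the log means the CRC recorded when the checkpoint file was written does not agree with the CRC computed on read. As a purely illustrative sketch (not taken from this report, and not necessarily the root cause here), the following stdlib-only Java snippet shows how a byte-order difference — plausible on a big-endian platform such as s390x — makes the same logical value produce different CRC32 checksums, which is the general shape of this failure:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.util.zip.CRC32;

public class ChecksumDemo {
    // Hypothetical helper (for illustration only): CRC32 of a single int
    // serialized in the given byte order.
    static long crcOfInt(int value, ByteOrder order) {
        byte[] bytes = ByteBuffer.allocate(Integer.BYTES)
                                 .order(order)
                                 .putInt(value)
                                 .array();
        CRC32 crc = new CRC32();
        crc.update(bytes);
        return crc.getValue();
    }

    public static void main(String[] args) {
        long big = crcOfInt(42, ByteOrder.BIG_ENDIAN);      // bytes {0,0,0,42}
        long little = crcOfInt(42, ByteOrder.LITTLE_ENDIAN); // bytes {42,0,0,0}
        // Same logical value, different byte layout, therefore different CRCs:
        // the same "expected vs. got" divergence the ChecksumException reports.
        System.out.println("big-endian CRC32:    " + big);
        System.out.println("little-endian CRC32: " + little);
        System.out.println("match: " + (big == little));
    }
}
```

Whether an endianness-sensitive write path is actually what corrupts the checkpoint checksum on s390x would need to be confirmed by debugging the writer and reader sides of ReliableCheckpointRDD.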
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)