Posted to user@spark.apache.org by Jem Tucker <je...@gmail.com> on 2015/07/17 09:39:02 UTC

Unread block data error

Hi,

I have been running a batch of data through my application for the last
couple of days and this morning discovered it had fallen over with the
following error.

java.lang.IllegalStateException: unread block data
    at java.io.ObjectInputStream$BlockDataInputStream.setBlockDataMode(ObjectInputStream.java:2376)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1360)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1946)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1870)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1752)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1328)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:350)
    at spark.JavaDeserializationStream.readObject(JavaSerializer.scala:23)
    at spark.JavaSerializerInstance.deserialize(JavaSerializer.scala:45)
    at spark.executor.Executor$TaskRunner.run(Executor.scala:73)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:662)

I have seen this error once before, and it turned out to be a dependency
version clash; this time, however, no dependencies have changed since the
last successful run. I am using Spark 1.2 on CDH 5.3.2.
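
In case it is a version mismatch again, I have been trying to confirm which
Spark jar the driver and each executor actually load. This is only a rough
sketch against the standard SparkContext API (the object name and app name
are made up for illustration), not code from our actual job:

import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical diagnostic, not part of the failing application.
object VersionCheck {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("version-check"))

    // Spark version the driver is running.
    println(s"Driver Spark version: ${sc.version}")

    // On each executor, report where a core Spark class was loaded from;
    // a stray older Spark jar on some node's classpath should show up here.
    val perExecutor = sc.parallelize(1 to sc.defaultParallelism, sc.defaultParallelism)
      .map { _ =>
        val src = classOf[SparkContext].getProtectionDomain.getCodeSource
        val jar = if (src == null) "unknown" else src.getLocation.toString
        (java.net.InetAddress.getLocalHost.getHostName, jar)
      }
      .distinct()
      .collect()

    perExecutor.foreach { case (host, jar) => println(s"$host -> $jar") }

    sc.stop()
  }
}

If the reported jars differ between hosts, that would point at the same
kind of version clash as last time, but so far everything looks consistent.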

Any ideas would be greatly appreciated!

Thanks,

Jem