Posted to dev@spark.apache.org by Qiuzhuang Lian <qi...@gmail.com> on 2014/10/25 03:16:59 UTC

serialVersionUID incompatible error in class BlockManagerId

Hi,

I updated git today, and when connecting to the Spark cluster I got a
serialVersionUID incompatibility error in class BlockManagerId.

Here is the log:

Shouldn't we give BlockManagerId a constant serialVersionUID to avoid
this?

Thanks,
Qiuzhuang

scala> val rdd = sc.parallelize(1 to 1000
14/10/25 09:10:48 ERROR Remoting: org.apache.spark.storage.BlockManagerId; local class incompatible: stream classdesc serialVersionUID = 2439208141545036836, local class serialVersionUID = 4657685702603429489
java.io.InvalidClassException: org.apache.spark.storage.BlockManagerId; local class incompatible: stream classdesc serialVersionUID = 2439208141545036836, local class serialVersionUID = 4657685702603429489
        at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:617)
        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1622)
        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        at akka.serialization.JavaSerializer$$anonfun$1.apply(Serializer.scala:136)
        at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
        at akka.serialization.JavaSerializer.fromBinary(Serializer.scala:136)
        at akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104)
        at scala.util.Try$.apply(Try.scala:161)
        at akka.serialization.Serialization.deserialize(Serialization.scala:98)
        at akka.remote.MessageSerializer$.deserialize(MessageSerializer.scala:23)
        at akka.remote.DefaultMessageDispatcher.payload$lzycompute$1(Endpoint.scala:58)
        at akka.remote.DefaultMessageDispatcher.payload$1(Endpoint.scala:58)
        at akka.remote.DefaultMessageDispatcher.dispatch(Endpoint.scala:76)
        at akka.remote.EndpointReader$$anonfun$receive$2.applyOrElse(Endpoint.scala:937)
        at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
        at akka.remote.EndpointActor.aroundReceive(Endpoint.scala:415)
        at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
        at akka.actor.ActorCell.invoke(ActorCell.scala:487)
        at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
        at akka.dispatch.Mailbox.run(Mailbox.scala:220)
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
14/10/25 09:10:48 ERROR SparkDeploySchedulerBackend: Asked to remove non
existant executor 1
0
14/10/25 09:11:21 ERROR Remoting: org.apache.spark.storage.BlockManagerId; local class incompatible: stream classdesc serialVersionUID = 2439208141545036836, local class serialVersionUID = 4657685702603429489
java.io.InvalidClassException: org.apache.spark.storage.BlockManagerId;
local class incompatible: stream classdesc serialVersionUID =
2439208141545036836, local class serialVersionUID = 4657685702603429489
        [stack trace identical to the one above]
14/10/25 09:11:21 ERROR SparkDeploySchedulerBackend: Asked to remove non
existant executor 1
14/10/25 09:11:54 INFO SparkDeploySchedulerBackend: Registered executor:
Actor[akka.tcp://sparkExecutor@DEV-02.SpringB.GZ:50006/user/Executor#-1410691203]
with ID 1
14/10/25 09:11:54 INFO DAGScheduler: Host added was in lost list earlier:
DEV-02.SpringB.GZ
14/10/25 09:11:55 ERROR TaskSchedulerImpl: Lost executor 1 on
DEV-02.SpringB.GZ: remote Akka client disassociated
14/10/25 09:11:55 WARN ReliableDeliverySupervisor: Association with remote
system [akka.tcp://sparkExecutor@DEV-02.SpringB.GZ:50006] has failed,
address is now gated for [5000] ms. Reason is: [Association failed with
[akka.tcp://sparkExecutor@DEV-02.SpringB.GZ:50006]].
14/10/25 09:11:55 INFO DAGScheduler: Executor lost: 1 (epoch 1)
14/10/25 09:11:55 INFO BlockManagerMasterActor: Trying to remove executor 1
from BlockManagerMaster.
14/10/25 09:11:55 INFO BlockManagerMaster: Removed 1 successfully in
removeExecutor
14/10/25 09:11:55 ERROR SparkDeploySchedulerBackend: Asked to remove non
existant executor 1
14/10/25 09:11:55 ERROR SparkDeploySchedulerBackend: Asked to remove non
existant executor 1
14/10/25 09:11:55 INFO AppClient$ClientActor: Executor updated:
app-20141025091012-0002/1 is now EXITED (Command exited with code 1)
14/10/25 09:11:55 INFO SparkDeploySchedulerBackend: Executor
app-20141025091012-0002/1 removed: Command exited with code 1
14/10/25 09:11:55 ERROR SparkDeploySchedulerBackend: Asked to remove non
existant executor 1
14/10/25 09:11:55 INFO AppClient$ClientActor: Executor added:
app-20141025091012-0002/3 on worker-20141025170311-DEV-02.SpringB.GZ-35162
(DEV-02.SpringB.GZ:35162) with 2 cores
14/10/25 09:11:55 INFO SparkDeploySchedulerBackend: Granted executor ID
app-20141025091012-0002/3 on hostPort DEV-02.SpringB.GZ:35162 with 2 cores,
512.0 MB RAM
14/10/25 09:11:55 INFO AppClient$ClientActor: Executor updated:
app-20141025091012-0002/3 is now LOADING
14/10/25 09:11:55 INFO AppClient$ClientActor: Executor updated:
app-20141025091012-0002/3 is now RUNNING
14/10/25 09:11:58 INFO SparkDeploySchedulerBackend: Registered executor:
Actor[akka.tcp://sparkExecutor@DEV-02.SpringB.GZ:50740/user/Executor#1229699385]
with ID 3
14/10/25 09:11:58 WARN ReliableDeliverySupervisor: Association with remote
system [akka.tcp://sparkExecutor@DEV-02.SpringB.GZ:50740] has failed,
address is now gated for [5000] ms. Reason is:
[org.apache.spark.storage.BlockManagerId; local class incompatible: stream
classdesc serialVersionUID = 2439208141545036836, local class
serialVersionUID = 4657685702603429489].
14/10/25 09:11:58 ERROR TaskSchedulerImpl: Lost executor 3 on
DEV-02.SpringB.GZ: remote Akka client disassociated
14/10/25 09:11:58 INFO DAGScheduler: Executor lost: 3 (epoch 2)
14/10/25 09:11:58 INFO BlockManagerMasterActor: Trying to remove executor 3
from BlockManagerMaster.
14/10/25 09:11:58 INFO BlockManagerMaster: Removed 3 successfully in
removeExecutor
14/10/25 09:12:31 ERROR Remoting: org.apache.spark.storage.BlockManagerId;
local class incompatible: stream classdesc serialVersionUID =
2439208141545036836, local class serialVersionUID = 4657685702603429489
java.io.InvalidClassException: org.apache.spark.storage.BlockManagerId;
local class incompatible: stream classdesc serialVersionUID =
2439208141545036836, local class serialVersionUID = 4657685702603429489
        [stack trace identical to the one above]
14/10/25 09:12:31 ERROR SparkDeploySchedulerBackend: Asked to remove non
existant executor 3
14/10/25 09:13:04 ERROR Remoting: org.apache.spark.storage.BlockManagerId;
local class incompatible: stream classdesc serialVersionUID =
2439208141545036836, local class serialVersionUID = 4657685702603429489
java.io.InvalidClassException: org.apache.spark.storage.BlockManagerId;
local class incompatible: stream classdesc serialVersionUID =
2439208141545036836, local class serialVersionUID = 4657685702603429489
        [stack trace identical to the one above]
14/10/25 09:13:04 ERROR SparkDeploySchedulerBackend: Asked to remove non
existant executor 3
14/10/25 09:13:37 ERROR SparkDeploySchedulerBackend: Asked to remove non
existant executor 3
14/10/25 09:13:37 ERROR SparkDeploySchedulerBackend: Asked to remove non
existant executor 3
14/10/25 09:13:37 INFO AppClient$ClientActor: Executor updated:
app-20141025091012-0002/3 is now EXITED (Command exited with code 1)
14/10/25 09:13:37 INFO SparkDeploySchedulerBackend: Executor
app-20141025091012-0002/3 removed: Command exited with code 1
14/10/25 09:13:37 ERROR SparkDeploySchedulerBackend: Asked to remove non
existant executor 3
14/10/25 09:13:37 INFO AppClient$ClientActor: Executor added:
app-20141025091012-0002/4 on worker-20141025170311-DEV-02.SpringB.GZ-35162
(DEV-02.SpringB.GZ:35162) with 2 cores
14/10/25 09:13:37 INFO SparkDeploySchedulerBackend: Granted executor ID
app-20141025091012-0002/4 on hostPort DEV-02.SpringB.GZ:35162 with 2 cores,
512.0 MB RAM
14/10/25 09:13:37 INFO AppClient$ClientActor: Executor updated:
app-20141025091012-0002/4 is now LOADING
14/10/25 09:13:38 INFO AppClient$ClientActor: Executor updated:
app-20141025091012-0002/4 is now RUNNING
14/10/25 09:13:40 INFO SparkDeploySchedulerBackend: Registered executor:
Actor[akka.tcp://sparkExecutor@DEV-02.SpringB.GZ:56019/user/Executor#1354626597]
with ID 4
14/10/25 09:13:40 WARN ReliableDeliverySupervisor: Association with remote
system [akka.tcp://sparkExecutor@DEV-02.SpringB.GZ:56019] has failed,
address is now gated for [5000] ms. Reason is:
[org.apache.spark.storage.BlockManagerId; local class incompatible: stream
classdesc serialVersionUID = 2439208141545036836, local class
serialVersionUID = 4657685702603429489].
14/10/25 09:13:40 ERROR TaskSchedulerImpl: Lost executor 4 on
DEV-02.SpringB.GZ: remote Akka client disassociated
14/10/25 09:13:40 INFO DAGScheduler: Executor lost: 4 (epoch 3)
14/10/25 09:13:40 INFO BlockManagerMasterActor: Trying to remove executor 4
from BlockManagerMaster.
14/10/25 09:13:40 INFO BlockManagerMaster: Removed 4 successfully in
removeExecutor
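A constant serialVersionUID, as suggested above, is declared like this in plain Java serialization. This is a minimal sketch using a hypothetical stand-in class called NodeId, not Spark's actual BlockManagerId:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Stand-in for a class like BlockManagerId. Pinning serialVersionUID makes
// the JVM skip its computed-hash comparison, so compatible changes (added
// methods, reordered members) no longer raise InvalidClassException when an
// older peer deserializes the stream.
class NodeId implements Serializable {
    private static final long serialVersionUID = 1L;
    final String host;
    final int port;
    NodeId(String host, int port) { this.host = host; this.port = port; }
}

public class SerialUidDemo {
    // Serialize an object to bytes and deserialize it back, the same
    // round trip Java serialization performs over the wire.
    static Object roundTrip(Object obj) throws Exception {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(obj);
        }
        try (ObjectInputStream in =
                 new ObjectInputStream(new ByteArrayInputStream(bytes.toByteArray()))) {
            return in.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        NodeId copy = (NodeId) roundTrip(new NodeId("dev-02", 50006));
        System.out.println(copy.host + ":" + copy.port);
    }
}
```

The trade-off is that a pinned UID only papers over truly compatible changes; if fields are removed or their types change, the stream still fails to deserialize, just with a less obvious error.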

Re: serialVersionUID incompatible error in class BlockManagerId

Posted by Qiuzhuang Lian <qi...@gmail.com>.
I updated git trunk and built on both Linux machines, so I think they should
have the same version. I am going to do a forced clean build and then retry.
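One quick way to confirm that two builds really match is to print the JVM-computed serialVersionUID of the suspect class on each machine and compare the numbers. A sketch, shown with a placeholder class; on a real cluster you would run it against org.apache.spark.storage.BlockManagerId with the Spark assembly jar on the classpath:

```java
import java.io.ObjectStreamClass;
import java.io.Serializable;

public class UidCheck {
    // Placeholder Serializable class; substitute the class you suspect.
    static class Sample implements Serializable { int x; }

    public static void main(String[] args) {
        // lookup() returns the serialization descriptor, which carries the
        // UID the JVM would write into (and check against) the stream.
        long uid = ObjectStreamClass.lookup(Sample.class).getSerialVersionUID();
        System.out.println("serialVersionUID = " + uid);
    }
}
```

The JDK's `serialver` tool does the same thing from the command line; mismatching numbers across machines mean the compiled classes differ.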

Thanks.


On Sat, Oct 25, 2014 at 9:23 AM, Josh Rosen <ro...@gmail.com> wrote:

> Are all processes (Master, Worker, Executors, Driver) running the same
> Spark build?  This error implies that you’re seeing protocol / binary
> incompatibilities between your Spark driver and cluster.
>
> Spark is API-compatible across the 1.x series, but we don’t make binary
> link-level compatibility guarantees:
> https://cwiki.apache.org/confluence/display/SPARK/Spark+Versioning+Policy.
> This means that your Spark driver’s runtime classpath should use the same
> version of Spark that’s installed on your cluster.  You can *compile* against
> a different API-compatible version of Spark, but the runtime versions must
> match across all components.
>
> To fix this issue, I’d check that you’ve run the “package” and “assembly”
> phases and that your Spark cluster is using this updated version.
>
> - Josh
>
> On October 24, 2014 at 6:17:26 PM, Qiuzhuang Lian (
> qiuzhuang.lian@gmail.com) wrote:
>
> [original message and log snipped; quoted in full above]
> 14/10/25 09:13:37 INFO SparkDeploySchedulerBackend: Executor
> app-20141025091012-0002/3 removed: Command exited with code 1
> 14/10/25 09:13:37 ERROR SparkDeploySchedulerBackend: Asked to remove non
> existant executor 3
> 14/10/25 09:13:37 INFO AppClient$ClientActor: Executor added:
> app-20141025091012-0002/4 on worker-20141025170311-DEV-02.SpringB.GZ-35162
> (DEV-02.SpringB.GZ:35162) with 2 cores
> 14/10/25 09:13:37 INFO SparkDeploySchedulerBackend: Granted executor ID
> app-20141025091012-0002/4 on hostPort DEV-02.SpringB.GZ:35162 with 2
> cores,
> 512.0 MB RAM
> 14/10/25 09:13:37 INFO AppClient$ClientActor: Executor updated:
> app-20141025091012-0002/4 is now LOADING
> 14/10/25 09:13:38 INFO AppClient$ClientActor: Executor updated:
> app-20141025091012-0002/4 is now RUNNING
> 14/10/25 09:13:40 INFO SparkDeploySchedulerBackend: Registered executor:
> Actor[akka.tcp://sparkExecutor@DEV-02.SpringB.GZ:56019/user/Executor#1354626597]
>
> with ID 4
> 14/10/25 09:13:40 WARN ReliableDeliverySupervisor: Association with remote
> system [akka.tcp://sparkExecutor@DEV-02.SpringB.GZ:56019] has failed,
> address is now gated for [5000] ms. Reason is:
> [org.apache.spark.storage.BlockManagerId; local class incompatible: stream
> classdesc serialVersionUID = 2439208141545036836, local class
> serialVersionUID = 4657685702603429489].
> 14/10/25 09:13:40 ERROR TaskSchedulerImpl: Lost executor 4 on
> DEV-02.SpringB.GZ: remote Akka client disassociated
> 14/10/25 09:13:40 INFO DAGScheduler: Executor lost: 4 (epoch 3)
> 14/10/25 09:13:40 INFO BlockManagerMasterActor: Trying to remove executor
> 4
> from BlockManagerMaster.
> 14/10/25 09:13:40 INFO BlockManagerMaster: Removed 4 successfully in
> removeExecutor
>
>

Re: serialVersionUID incompatible error in class BlockManagerId

Posted by Qiuzhuang Lian <qi...@gmail.com>.
After doing a clean rebuild, it works now.

Thanks,
Qiuzhuang
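The constant serialVersionUID suggested in the original post can be pinned with Scala's `@SerialVersionUID` annotation; a minimal sketch (the class name and fields below are illustrative, not Spark's actual `BlockManagerId`):

```scala
import java.io.ObjectStreamClass

// Pinning the UID: recompiles and structural tweaks no longer change it,
// so old and new builds can still deserialize each other's instances
// (as long as the field layout stays compatible).
@SerialVersionUID(1L)
class StableId(val host: String, val port: Int) extends Serializable

object PinDemo {
  def main(args: Array[String]): Unit = {
    val uid = ObjectStreamClass.lookup(classOf[StableId]).getSerialVersionUID
    println(uid) // prints 1
  }
}
```

Note that pinning the UID only suppresses the version check; if the field layout itself changes between builds, deserialization can still fail in other ways.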

On Sat, Oct 25, 2014 at 9:42 AM, Nan Zhu <zh...@gmail.com> wrote:

>  In my experience, there are more issues than just BlockManager when you
> try to run a Spark application whose build version differs from your
> cluster's….
>
> I once tried to run a jdbc server built from branch-jdbc-1.0 against a
> branch-1.0 cluster…no workaround exists…I just had to replace the cluster
> jar with the branch-jdbc-1.0 jar file…..
>
> Best,
>
> --
> Nan Zhu
>
> On Friday, October 24, 2014 at 9:23 PM, Josh Rosen wrote:
>
> Are all processes (Master, Worker, Executors, Driver) running the same
> Spark build?  This error implies that you’re seeing protocol / binary
> incompatibilities between your Spark driver and cluster.
>
> Spark is API-compatible across the 1.x series, but we don’t make binary
> link-level compatibility guarantees:
> https://cwiki.apache.org/confluence/display/SPARK/Spark+Versioning+Policy.
> This means that your Spark driver’s runtime classpath should use the same
> version of Spark that’s installed on your cluster.  You can compile against
> a different API-compatible version of Spark, but the runtime versions must
> match across all components.
>
> To fix this issue, I’d check that you’ve run the “package” and “assembly”
> phases and that your Spark cluster is using this updated version.
>
> - Josh
>
> On October 24, 2014 at 6:17:26 PM, Qiuzhuang Lian (
> qiuzhuang.lian@gmail.com) wrote:
>
> Hi,
>
> I update git today and when connecting to spark cluster, I got
> the serialVersionUID incompatible error in class BlockManagerId.
>
> Here is the log,
>
> Shouldn't we give BlockManagerId a constant serialVersionUID to avoid
> this?
>
> Thanks,
> Qiuzhuang
>

Re: serialVersionUID incompatible error in class BlockManagerId

Posted by Nan Zhu <zh...@gmail.com>.
In my experience, there are more issues than just BlockManager when you try to run a Spark application whose build version differs from your cluster's….

I once tried to run a jdbc server built from branch-jdbc-1.0 against a branch-1.0 cluster…no workaround exists…I just had to replace the cluster jar with the branch-jdbc-1.0 jar file…..

Best,  

--  
Nan Zhu


On Friday, October 24, 2014 at 9:23 PM, Josh Rosen wrote:

> Are all processes (Master, Worker, Executors, Driver) running the same Spark build?  This error implies that you’re seeing protocol / binary incompatibilities between your Spark driver and cluster.
>  
> Spark is API-compatible across the 1.x series, but we don’t make binary link-level compatibility guarantees: https://cwiki.apache.org/confluence/display/SPARK/Spark+Versioning+Policy.  This means that your Spark driver’s runtime classpath should use the same version of Spark that’s installed on your cluster.  You can compile against a different API-compatible version of Spark, but the runtime versions must match across all components.
>  
> To fix this issue, I’d check that you’ve run the “package” and “assembly” phases and that your Spark cluster is using this updated version.
>  
> - Josh
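For context on why a mixed build trips this check: when a Serializable class does not declare a serialVersionUID, the JVM derives one from a hash of the class's name, interfaces, fields, and method signatures, so almost any structural change yields a new value. A small sketch (class names are illustrative; here the names themselves already differ, but the same divergence happens for identically-named classes compiled with different members):

```scala
import java.io.ObjectStreamClass

// Neither class declares serialVersionUID, so the JVM computes one from
// the class structure. The two computed UIDs differ, which is exactly the
// mismatch that InvalidClassException reports when builds are mixed.
class V1(val host: String) extends Serializable
class V2(val host: String) extends Serializable { def pretty: String = host }

object UidDemo {
  def main(args: Array[String]): Unit = {
    val u1 = ObjectStreamClass.lookup(classOf[V1]).getSerialVersionUID
    val u2 = ObjectStreamClass.lookup(classOf[V2]).getSerialVersionUID
    println(u1 != u2) // prints true
  }
}
```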
>  
> On October 24, 2014 at 6:17:26 PM, Qiuzhuang Lian (qiuzhuang.lian@gmail.com (mailto:qiuzhuang.lian@gmail.com)) wrote:
>  
> Hi,  
>  
> I update git today and when connecting to spark cluster, I got  
> the serialVersionUID incompatible error in class BlockManagerId.  
>  
> Here is the log,  
>  
> Shouldn't we give BlockManagerId a constant serialVersionUID to avoid  
> this?  
>  
> Thanks,  
> Qiuzhuang  
>  
> scala> val rdd = sc.parparallelize(1 to 100014/10/25 09:10:48 ERROR  
> Remoting: org.apache.spark.storage.BlockManagerId; local class  
> incompatible: stream classdesc serialVersionUID = 2439208141545036836,  
> local class serialVersionUID = 4657685702603429489  
> java.io.InvalidClassException: org.apache.spark.storage.BlockManagerId;  
> local class incompatible: stream classdesc serialVersionUID =  
> 2439208141545036836, local class serialVersionUID = 4657685702603429489  
> at  
> java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:617)  
> at  
> java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1622)  
> at  
> java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)  
> at  
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)  
> at  
> java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)  
> at  
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)  
> at  
> java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)  
> at  
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)  
> at  
> java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)  
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)  
> at  
> akka.serialization.JavaSerializer$$anonfun$1.apply(Serializer.scala:136)  
> at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)  
> at  
> akka.serialization.JavaSerializer.fromBinary(Serializer.scala:136)  
> at  
> akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104)  
> at scala.util.Try$.apply(Try.scala:161)  
> at  
> akka.serialization.Serialization.deserialize(Serialization.scala:98)  
> at  
> akka.remote.MessageSerializer$.deserialize(MessageSerializer.scala:23)  
> at  
> akka.remote.DefaultMessageDispatcher.payload$lzycompute$1(Endpoint.scala:58)  
> at akka.remote.DefaultMessageDispatcher.payload$1(Endpoint.scala:58)  
> at akka.remote.DefaultMessageDispatcher.dispatch(Endpoint.scala:76)  
> at  
> akka.remote.EndpointReader$$anonfun$receive$2.applyOrElse(Endpoint.scala:937)  
> at akka.actor.Actor$class.aroundReceive(Actor.scala:465)  
> at akka.remote.EndpointActor.aroundReceive(Endpoint.scala:415)  
> at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)  
> at akka.actor.ActorCell.invoke(ActorCell.scala:487)  
> at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)  
> at akka.dispatch.Mailbox.run(Mailbox.scala:220)  
> at  
> akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)  
> at  
> scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)  
> at  
> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)  
> at  
> scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)  
> at  
> scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)  
> 14/10/25 09:10:48 ERROR SparkDeploySchedulerBackend: Asked to remove non  
> existant executor 1  
> 0014/10/25 09:11:21 ERROR Remoting:  
> org.apache.spark.storage.BlockManagerId; local class incompatible: stream  
> classdesc serialVersionUID = 2439208141545036836, local class  
> serialVersionUID = 4657685702603429489  
> java.io.InvalidClassException: org.apache.spark.storage.BlockManagerId;  
> local class incompatible: stream classdesc serialVersionUID =  
> 2439208141545036836, local class serialVersionUID = 4657685702603429489  
> at  
> java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:617)  
> at  
> java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1622)  
> at  
> java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)  
> at  
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)  
> at  
> java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)  
> at  
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)  
> at  
> java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)  
> at  
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)  
> at  
> java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)  
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)  
> at  
> akka.serialization.JavaSerializer$$anonfun$1.apply(Serializer.scala:136)  
> at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)  
> at  
> akka.serialization.JavaSerializer.fromBinary(Serializer.scala:136)  
> at  
> akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104)  
> at scala.util.Try$.apply(Try.scala:161)  
> at  
> akka.serialization.Serialization.deserialize(Serialization.scala:98)  
> at  
> akka.remote.MessageSerializer$.deserialize(MessageSerializer.scala:23)  
> at  
> akka.remote.DefaultMessageDispatcher.payload$lzycompute$1(Endpoint.scala:58)  
> at akka.remote.DefaultMessageDispatcher.payload$1(Endpoint.scala:58)  
> at akka.remote.DefaultMessageDispatcher.dispatch(Endpoint.scala:76)  
> at  
> akka.remote.EndpointReader$$anonfun$receive$2.applyOrElse(Endpoint.scala:937)  
> at akka.actor.Actor$class.aroundReceive(Actor.scala:465)  
> at akka.remote.EndpointActor.aroundReceive(Endpoint.scala:415)  
> at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)  
> at akka.actor.ActorCell.invoke(ActorCell.scala:487)  
> at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)  
> at akka.dispatch.Mailbox.run(Mailbox.scala:220)  
> at  
> akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)  
> at  
> scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)  
> at  
> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)  
> at  
> scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)  
> at  
> scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)  
> 14/10/25 09:11:21 ERROR SparkDeploySchedulerBackend: Asked to remove non  
> existant executor 1  
> 14/10/25 09:11:54 INFO SparkDeploySchedulerBackend: Registered executor:  
> Actor[akka.tcp://sparkExecutor@DEV-02.SpringB.GZ (mailto:sparkExecutor@DEV-02.SpringB.GZ):50006/user/Executor#-1410691203]  
> with ID 1  
> 14/10/25 09:11:54 INFO DAGScheduler: Host added was in lost list earlier:  
> DEV-02.SpringB.GZ  
> 14/10/25 09:11:55 ERROR TaskSchedulerImpl: Lost executor 1 on  
> DEV-02.SpringB.GZ: remote Akka client disassociated  
> 14/10/25 09:11:55 WARN ReliableDeliverySupervisor: Association with remote  
> system [akka.tcp://sparkExecutor@DEV-02.SpringB.GZ (mailto:sparkExecutor@DEV-02.SpringB.GZ):50006] has failed,  
> address is now gated for [5000] ms. Reason is: [Association failed with  
> [akka.tcp://sparkExecutor@DEV-02.SpringB.GZ (mailto:sparkExecutor@DEV-02.SpringB.GZ):50006]].  
> 14/10/25 09:11:55 INFO DAGScheduler: Executor lost: 1 (epoch 1)  
> 14/10/25 09:11:55 INFO BlockManagerMasterActor: Trying to remove executor 1  
> from BlockManagerMaster.  
> 14/10/25 09:11:55 INFO BlockManagerMaster: Removed 1 successfully in  
> removeExecutor  
> 14/10/25 09:11:55 ERROR SparkDeploySchedulerBackend: Asked to remove non  
> existant executor 1  
> 14/10/25 09:11:55 ERROR SparkDeploySchedulerBackend: Asked to remove non  
> existant executor 1  
> 14/10/25 09:11:55 INFO AppClient$ClientActor: Executor updated:  
> app-20141025091012-0002/1 is now EXITED (Command exited with code 1)  
> 14/10/25 09:11:55 INFO SparkDeploySchedulerBackend: Executor  
> app-20141025091012-0002/1 removed: Command exited with code 1  
> 14/10/25 09:11:55 ERROR SparkDeploySchedulerBackend: Asked to remove non  
> existant executor 1  
> 14/10/25 09:11:55 INFO AppClient$ClientActor: Executor added:  
> app-20141025091012-0002/3 on worker-20141025170311-DEV-02.SpringB.GZ-35162  
> (DEV-02.SpringB.GZ:35162) with 2 cores  
> 14/10/25 09:11:55 INFO SparkDeploySchedulerBackend: Granted executor ID  
> app-20141025091012-0002/3 on hostPort DEV-02.SpringB.GZ:35162 with 2 cores,  
> 512.0 MB RAM  
> 14/10/25 09:11:55 INFO AppClient$ClientActor: Executor updated:  
> app-20141025091012-0002/3 is now LOADING  
> 14/10/25 09:11:55 INFO AppClient$ClientActor: Executor updated:  
> app-20141025091012-0002/3 is now RUNNING  
> 14/10/25 09:11:58 INFO SparkDeploySchedulerBackend: Registered executor:  
> Actor[akka.tcp://sparkExecutor@DEV-02.SpringB.GZ:50740/user/Executor#1229699385]  
> with ID 3  
> 14/10/25 09:11:58 WARN ReliableDeliverySupervisor: Association with remote  
> system [akka.tcp://sparkExecutor@DEV-02.SpringB.GZ:50740] has failed,  
> address is now gated for [5000] ms. Reason is:  
> [org.apache.spark.storage.BlockManagerId; local class incompatible: stream  
> classdesc serialVersionUID = 2439208141545036836, local class  
> serialVersionUID = 4657685702603429489].  
> 14/10/25 09:11:58 ERROR TaskSchedulerImpl: Lost executor 3 on  
> DEV-02.SpringB.GZ: remote Akka client disassociated  
> 14/10/25 09:11:58 INFO DAGScheduler: Executor lost: 3 (epoch 2)  
> 14/10/25 09:11:58 INFO BlockManagerMasterActor: Trying to remove executor 3  
> from BlockManagerMaster.  
> 14/10/25 09:11:58 INFO BlockManagerMaster: Removed 3 successfully in  
> removeExecutor  
> 14/10/25 09:12:31 ERROR Remoting: org.apache.spark.storage.BlockManagerId;  
> local class incompatible: stream classdesc serialVersionUID =  
> 2439208141545036836, local class serialVersionUID = 4657685702603429489  
> java.io.InvalidClassException: org.apache.spark.storage.BlockManagerId;  
> local class incompatible: stream classdesc serialVersionUID =  
> 2439208141545036836, local class serialVersionUID = 4657685702603429489  
> at  
> java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:617)  
> at  
> java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1622)  
> at  
> java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)  
> at  
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)  
> at  
> java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)  
> at  
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)  
> at  
> java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)  
> at  
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)  
> at  
> java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)  
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)  
> at  
> akka.serialization.JavaSerializer$$anonfun$1.apply(Serializer.scala:136)  
> at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)  
> at  
> akka.serialization.JavaSerializer.fromBinary(Serializer.scala:136)  
> at  
> akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104)  
> at scala.util.Try$.apply(Try.scala:161)  
> at  
> akka.serialization.Serialization.deserialize(Serialization.scala:98)  
> at  
> akka.remote.MessageSerializer$.deserialize(MessageSerializer.scala:23)  
> at  
> akka.remote.DefaultMessageDispatcher.payload$lzycompute$1(Endpoint.scala:58)  
> at akka.remote.DefaultMessageDispatcher.payload$1(Endpoint.scala:58)  
> at akka.remote.DefaultMessageDispatcher.dispatch(Endpoint.scala:76)  
> at  
> akka.remote.EndpointReader$$anonfun$receive$2.applyOrElse(Endpoint.scala:937)  
> at akka.actor.Actor$class.aroundReceive(Actor.scala:465)  
> at akka.remote.EndpointActor.aroundReceive(Endpoint.scala:415)  
> at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)  
> at akka.actor.ActorCell.invoke(ActorCell.scala:487)  
> at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)  
> at akka.dispatch.Mailbox.run(Mailbox.scala:220)  
> at  
> akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)  
> at  
> scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)  
> at  
> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)  
> at  
> scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)  
> at  
> scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)  
> 14/10/25 09:12:31 ERROR SparkDeploySchedulerBackend: Asked to remove non  
> existant executor 3  
> 14/10/25 09:13:04 ERROR Remoting: org.apache.spark.storage.BlockManagerId;  
> local class incompatible: stream classdesc serialVersionUID =  
> 2439208141545036836, local class serialVersionUID = 4657685702603429489  
> java.io.InvalidClassException: org.apache.spark.storage.BlockManagerId;  
> local class incompatible: stream classdesc serialVersionUID =  
> 2439208141545036836, local class serialVersionUID = 4657685702603429489  
> at  
> java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:617)  
> at  
> java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1622)  
> at  
> java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)  
> at  
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)  
> at  
> java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)  
> at  
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)  
> at  
> java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)  
> at  
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)  
> at  
> java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)  
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)  
> at  
> akka.serialization.JavaSerializer$$anonfun$1.apply(Serializer.scala:136)  
> at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)  
> at  
> akka.serialization.JavaSerializer.fromBinary(Serializer.scala:136)  
> at  
> akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104)  
> at scala.util.Try$.apply(Try.scala:161)  
> at  
> akka.serialization.Serialization.deserialize(Serialization.scala:98)  
> at  
> akka.remote.MessageSerializer$.deserialize(MessageSerializer.scala:23)  
> at  
> akka.remote.DefaultMessageDispatcher.payload$lzycompute$1(Endpoint.scala:58)  
> at akka.remote.DefaultMessageDispatcher.payload$1(Endpoint.scala:58)  
> at akka.remote.DefaultMessageDispatcher.dispatch(Endpoint.scala:76)  
> at  
> akka.remote.EndpointReader$$anonfun$receive$2.applyOrElse(Endpoint.scala:937)  
> at akka.actor.Actor$class.aroundReceive(Actor.scala:465)  
> at akka.remote.EndpointActor.aroundReceive(Endpoint.scala:415)  
> at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)  
> at akka.actor.ActorCell.invoke(ActorCell.scala:487)  
> at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)  
> at akka.dispatch.Mailbox.run(Mailbox.scala:220)  
> at  
> akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)  
> at  
> scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)  
> at  
> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)  
> at  
> scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)  
> at  
> scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)  
> 14/10/25 09:13:04 ERROR SparkDeploySchedulerBackend: Asked to remove non  
> existant executor 3  
> 14/10/25 09:13:37 ERROR SparkDeploySchedulerBackend: Asked to remove non  
> existant executor 3  
> 14/10/25 09:13:37 ERROR SparkDeploySchedulerBackend: Asked to remove non  
> existant executor 3  
> 14/10/25 09:13:37 INFO AppClient$ClientActor: Executor updated:  
> app-20141025091012-0002/3 is now EXITED (Command exited with code 1)  
> 14/10/25 09:13:37 INFO SparkDeploySchedulerBackend: Executor  
> app-20141025091012-0002/3 removed: Command exited with code 1  
> 14/10/25 09:13:37 ERROR SparkDeploySchedulerBackend: Asked to remove non  
> existant executor 3  
> 14/10/25 09:13:37 INFO AppClient$ClientActor: Executor added:  
> app-20141025091012-0002/4 on worker-20141025170311-DEV-02.SpringB.GZ-35162  
> (DEV-02.SpringB.GZ:35162) with 2 cores  
> 14/10/25 09:13:37 INFO SparkDeploySchedulerBackend: Granted executor ID  
> app-20141025091012-0002/4 on hostPort DEV-02.SpringB.GZ:35162 with 2 cores,  
> 512.0 MB RAM  
> 14/10/25 09:13:37 INFO AppClient$ClientActor: Executor updated:  
> app-20141025091012-0002/4 is now LOADING  
> 14/10/25 09:13:38 INFO AppClient$ClientActor: Executor updated:  
> app-20141025091012-0002/4 is now RUNNING  
> 14/10/25 09:13:40 INFO SparkDeploySchedulerBackend: Registered executor:  
> Actor[akka.tcp://sparkExecutor@DEV-02.SpringB.GZ:56019/user/Executor#1354626597]  
> with ID 4  
> 14/10/25 09:13:40 WARN ReliableDeliverySupervisor: Association with remote  
> system [akka.tcp://sparkExecutor@DEV-02.SpringB.GZ:56019] has failed,  
> address is now gated for [5000] ms. Reason is:  
> [org.apache.spark.storage.BlockManagerId; local class incompatible: stream  
> classdesc serialVersionUID = 2439208141545036836, local class  
> serialVersionUID = 4657685702603429489].  
> 14/10/25 09:13:40 ERROR TaskSchedulerImpl: Lost executor 4 on  
> DEV-02.SpringB.GZ: remote Akka client disassociated  
> 14/10/25 09:13:40 INFO DAGScheduler: Executor lost: 4 (epoch 3)  
> 14/10/25 09:13:40 INFO BlockManagerMasterActor: Trying to remove executor 4  
> from BlockManagerMaster.  
> 14/10/25 09:13:40 INFO BlockManagerMaster: Removed 4 successfully in  
> removeExecutor  
>  
>  



Re: serialVersionUID incompatible error in class BlockManagerId

Posted by Josh Rosen <ro...@gmail.com>.
Are all processes (Master, Worker, Executors, Driver) running the same Spark build?  This error implies that you’re seeing protocol / binary incompatibilities between your Spark driver and cluster.

Spark is API-compatible across the 1.x series, but we don’t make binary link-level compatibility guarantees: https://cwiki.apache.org/confluence/display/SPARK/Spark+Versioning+Policy.  This means that your Spark driver’s runtime classpath should use the same version of Spark that’s installed on your cluster.  You can compile against a different API-compatible version of Spark, but the runtime versions must match across all components.
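For context, the mismatched numbers in the log come from Java serialization deriving a serialVersionUID from the class's structure whenever none is declared explicitly, so two different builds of the same class can disagree. A minimal sketch of both behaviors (the class names here are illustrative, not Spark's actual code):

```java
import java.io.ObjectStreamClass;
import java.io.Serializable;

public class SerialVersionDemo {
    // Without an explicit UID, the JVM hashes the class's structure
    // (fields, methods, interfaces) into a serialVersionUID, so a
    // recompile after any structural change yields a new value --
    // the exact mismatch reported for BlockManagerId above.
    static class Unpinned implements Serializable {
        String host;
        int port;
    }

    // With an explicit constant, the UID survives structural changes.
    // (In Scala the equivalent is the @SerialVersionUID(1L) annotation.)
    static class Pinned implements Serializable {
        private static final long serialVersionUID = 1L;
        String host;
        int port;
    }

    public static void main(String[] args) {
        long derived = ObjectStreamClass.lookup(Unpinned.class).getSerialVersionUID();
        long pinned = ObjectStreamClass.lookup(Pinned.class).getSerialVersionUID();
        System.out.println("derived: " + derived); // build-dependent hash
        System.out.println("pinned:  " + pinned);  // always 1
    }
}
```

Note that pinning the UID only silences the class-descriptor check; if the fields themselves change between builds, deserialization can still go wrong, which is why matching runtime versions across all components is the safer fix.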

To fix this issue, I’d check that you’ve re-run the “package” and “assembly” phases after updating and that your Spark cluster is running this rebuilt version.

- Josh

On October 24, 2014 at 6:17:26 PM, Qiuzhuang Lian (qiuzhuang.lian@gmail.com) wrote:

Hi,  

I update git today and when connecting to spark cluster, I got  
the serialVersionUID incompatible error in class BlockManagerId.  

Here is the log,  

Shouldn't we better give BlockManagerId a constant serialVersionUID avoid  
this?  

Thanks,  
Qiuzhuang  

[quoted log output snipped; identical to the log shown earlier in the thread]