Posted to user@spark.apache.org by kant kodali <ka...@gmail.com> on 2016/09/03 06:49:32 UTC

any idea what this error could be?

I am running Spark in standalone mode. I get this error when I run my driver
program. I am using Spark 2.0.0. Any idea what this error could be?


Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/09/02 23:44:44 INFO SparkContext: Running Spark version 2.0.0
16/09/02 23:44:44 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/09/02 23:44:45 INFO SecurityManager: Changing view acls to: kantkodali
16/09/02 23:44:45 INFO SecurityManager: Changing modify acls to: kantkodali
16/09/02 23:44:45 INFO SecurityManager: Changing view acls groups to:
16/09/02 23:44:45 INFO SecurityManager: Changing modify acls groups to:
16/09/02 23:44:45 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(kantkodali); groups with view permissions: Set(); users with modify permissions: Set(kantkodali); groups with modify permissions: Set()
16/09/02 23:44:45 INFO Utils: Successfully started service 'sparkDriver' on port 62256.
16/09/02 23:44:45 INFO SparkEnv: Registering MapOutputTracker
16/09/02 23:44:45 INFO SparkEnv: Registering BlockManagerMaster
16/09/02 23:44:45 INFO DiskBlockManager: Created local directory at /private/var/folders/_6/lfxt933j3bd_xhq0m7dwm8s00000gn/T/blockmgr-b56eea49-0102-4570-865a-1d3d230f0ffc
16/09/02 23:44:45 INFO MemoryStore: MemoryStore started with capacity 2004.6 MB
16/09/02 23:44:45 INFO SparkEnv: Registering OutputCommitCoordinator
16/09/02 23:44:45 INFO Utils: Successfully started service 'SparkUI' on port 4040.
16/09/02 23:44:45 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.0.191:4040
16/09/02 23:44:45 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://52.43.37.223:7077...
16/09/02 23:44:46 INFO TransportClientFactory: Successfully created connection to /52.43.37.223:7077 after 70 ms (0 ms spent in bootstraps)
16/09/02 23:44:46 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master 52.43.37.223:7077
org.apache.spark.SparkException: Exception thrown in awaitResult
    at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75)
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:162)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:83)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:88)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:96)
    at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1$$anon$1.run(StandaloneAppClient.scala:109)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: java.io.InvalidClassException: org.apache.spark.rpc.netty.RequestMessage; local class incompatible: stream classdesc serialVersionUID = -2221986757032131007, local class serialVersionUID = -5447855329526097695
    at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:616)
    at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630)
    at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:108)
    at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:259)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
    at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:308)
    at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:258)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
    at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:257)
    at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:578)
    at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:563)
    at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:158)
    at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:106)
    at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:119)
    at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51)
    at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    at java.lang.Thread.run(Thread.java:745)
    at org.apache.spark.network.client.TransportResponseHandler.handle(TransportResponseHandler.java:190)
    at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:121)
    at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51)
    at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:307)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:293)
    at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:307)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:293)
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:307)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:293)
    at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:307)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:293)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:840)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:112)
    ... 1 more
16/09/02 23:45:05 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://52.43.37.223:7077...
16/09/02 23:45:06 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master 52.43.37.223:7077
org.apache.spark.SparkException: Exception thrown in awaitResult
    at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75)
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:162)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:83)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:88)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:96)
    at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1$$anon$1.run(StandaloneAppClient.scala:109)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: java.io.InvalidClassException: org.apache.spark.rpc.netty.RequestMessage; local class incompatible: stream classdesc serialVersionUID = -2221986757032131007, local class serialVersionUID = -5447855329526097695
    at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:616)
    at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630)
    at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:108)
    at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:259)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
    at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:308)


Re: any idea what this error could be?

Posted by Fridtjof Sander <fr...@googlemail.com>.
I see. The default Scala version changed to 2.11 with Spark 2.0.0, AFAIK, so that's probably the version you get when downloading the prepackaged binaries. Glad I could help ;)
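
(To verify this on a given installation: the snippet below, run inside that installation's spark-shell, prints the Scala and Spark versions it was built with. Both calls are standard APIs; the example outputs are only what the stock 2.0.0 download would be expected to show.)

    // Run inside spark-shell of the installation you want to inspect.
    println(scala.util.Properties.versionNumberString) // Scala version, e.g. 2.11.8
    println(org.apache.spark.SPARK_VERSION)            // Spark version, e.g. 2.0.0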

On 3 September 2016 at 23:59:51 CEST, kant kodali <ka...@gmail.com> wrote:
>@Fridtjof you are right!
>Changing it to this fixed it:
>compile group: 'org.apache.spark', name: 'spark-core_2.11', version: '2.0.0'
>compile group: 'org.apache.spark', name: 'spark-streaming_2.11', version: '2.0.0'
>
>
>
>
>
>
>
>On Sat, Sep 3, 2016 12:30 PM, kant kodali kanth909@gmail.com
>wrote:
>I increased the memory but nothing has changed; I still get the same
>error.
>@Fridtjof, on my driver side I am using the following dependencies:
>compile group: 'org.apache.spark', name: 'spark-core_2.10', version: '2.0.0'
>compile group: 'org.apache.spark', name: 'spark-streaming_2.10', version: '2.0.0'
>On the executor side I don't know what jars are being used, but I have
>installed using this zip file: spark-2.0.0-bin-hadoop2.7.tgz
> 
>
>
>
>
>
>On Sat, Sep 3, 2016 4:20 AM, Fridtjof Sander
>fridtjof.sander@googlemail.com
>wrote:
>There is an InvalidClassException complaining about non-matching
>serialVersionUIDs. Shouldn't that be caused by different jars on
>executors and
>driver?
>
>
>On 03.09.2016 at 1:04 PM, "Tal Grynbaum" <ta...@gmail.com> wrote:
>My guess is that you're running out of memory somewhere. Try to increase
>the driver memory and/or executor memory.
>
>
>On Sat, Sep 3, 2016, 11:42 kant kodali <ka...@gmail.com> wrote:
>I am running this on AWS.
>
> 
>
>
>
>
>
>On Fri, Sep 2, 2016 11:49 PM, kant kodali kanth909@gmail.com
>wrote:
>I am running Spark in standalone mode. I get this error when I run my
>driver program. I am using Spark 2.0.0. Any idea what this error could be?
>
>

Re: any idea what this error could be?

Posted by kant kodali <ka...@gmail.com>.
@Fridtjof you are right!
Changing it to this fixed it:
compile group: 'org.apache.spark', name: 'spark-core_2.11', version: '2.0.0'
compile group: 'org.apache.spark', name: 'spark-streaming_2.11', version: '2.0.0'
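
(A small guard one could add to the driver, as a sketch only: the expected value below is an assumption matching the _2.11 artifacts above. It fails fast if the driver is ever rebuilt against a different Scala major version than the cluster binaries.)

    // Sketch: fail fast on a driver/cluster Scala major-version mismatch.
    val expectedScala = "2.11" // assumption: cluster runs the stock 2.11 binaries
    val driverScala = scala.util.Properties.versionNumberString // e.g. "2.11.8"
    require(driverScala.startsWith(expectedScala),
      s"Driver built with Scala $driverScala, cluster expects Scala $expectedScala")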







On Sat, Sep 3, 2016 12:30 PM, kant kodali kanth909@gmail.com
wrote:
I increased the memory but nothing has changed; I still get the same error.
@Fridtjof, on my driver side I am using the following dependencies:
compile group: 'org.apache.spark', name: 'spark-core_2.10', version: '2.0.0'
compile group: 'org.apache.spark', name: 'spark-streaming_2.10', version: '2.0.0'
On the executor side I don't know what jars are being used, but I have installed
using this zip file: spark-2.0.0-bin-hadoop2.7.tgz
 





On Sat, Sep 3, 2016 4:20 AM, Fridtjof Sander fridtjof.sander@googlemail.com
wrote:
There is an InvalidClassException complaining about non-matching
serialVersionUIDs. Shouldn't that be caused by different jars on executors and
driver?


On 03.09.2016 at 1:04 PM, "Tal Grynbaum" <ta...@gmail.com> wrote:
My guess is that you're running out of memory somewhere.  Try to increase the
driver memory and/or executor memory.   


On Sat, Sep 3, 2016, 11:42 kant kodali <ka...@gmail.com> wrote:
I am running this on AWS.

 





On Fri, Sep 2, 2016 11:49 PM, kant kodali kanth909@gmail.com
wrote:
I am running Spark in standalone mode. I get this error when I run my driver
program. I am using Spark 2.0.0. Any idea what this error could be?



Re: any idea what this error could be?

Posted by kant kodali <ka...@gmail.com>.
I increased the memory but nothing has changed; I still get the same error.
@Fridtjof, on my driver side I am using the following dependencies:
compile group: 'org.apache.spark', name: 'spark-core_2.10', version: '2.0.0'
compile group: 'org.apache.spark', name: 'spark-streaming_2.10', version: '2.0.0'
On the executor side I don't know what jars are being used, but I have installed
using this zip file: spark-2.0.0-bin-hadoop2.7.tgz
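
(To answer the open question about the executor-side jars, a sketch that lists the Scala library bundled with the installation; it assumes SPARK_HOME points at the unpacked spark-2.0.0-bin-hadoop2.7 directory, whose jars live under jars/ in Spark 2.x.)

    // Sketch: find the Scala version the standalone installation ships with.
    import java.io.File
    val jarsDir = new File(sys.env("SPARK_HOME"), "jars")
    Option(jarsDir.listFiles()).getOrElse(Array.empty[File])
      .map(_.getName)
      .filter(_.startsWith("scala-library"))
      .foreach(println) // e.g. scala-library-2.11.8.jar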
 





On Sat, Sep 3, 2016 4:20 AM, Fridtjof Sander fridtjof.sander@googlemail.com
wrote:
There is an InvalidClassException complaining about non-matching
serialVersionUIDs. Shouldn't that be caused by different jars on executors and
driver?


On 03.09.2016 at 1:04 PM, "Tal Grynbaum" <ta...@gmail.com> wrote:
My guess is that you're running out of memory somewhere.  Try to increase the
driver memory and/or executor memory.   


On Sat, Sep 3, 2016, 11:42 kant kodali <ka...@gmail.com> wrote:
I am running this on AWS.

 





On Fri, Sep 2, 2016 11:49 PM, kant kodali kanth909@gmail.com
wrote:
I am running Spark in standalone mode. I get this error when I run my driver
program. I am using Spark 2.0.0. Any idea what this error could be?



Re: any idea what this error could be?

Posted by Fridtjof Sander <fr...@googlemail.com>.
There is an InvalidClassException complaining about non-matching
serialVersionUIDs. Shouldn't that be caused by different jars on executors
and driver?
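
(One way to confirm this diagnosis, as a sketch: java.io.ObjectStreamClass can report the serialVersionUID each side computes for the failing class. Run the snippet once with the driver's classpath and once with the master's jars on the classpath; two different numbers confirm mixed builds. This assumes the class is loadable and Java-serializable on both sides, which the stack trace indicates.)

    // Sketch: print the serialVersionUID this classpath computes for the
    // class named in the InvalidClassException, then compare across machines.
    import java.io.ObjectStreamClass
    val cls = Class.forName("org.apache.spark.rpc.netty.RequestMessage")
    println(ObjectStreamClass.lookup(cls).getSerialVersionUID)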

On 03.09.2016 at 1:04 PM, "Tal Grynbaum" <ta...@gmail.com> wrote:

> My guess is that you're running out of memory somewhere.  Try to increase
> the driver memory and/or executor memory.
>
> On Sat, Sep 3, 2016, 11:42 kant kodali <ka...@gmail.com> wrote:
>
>> I am running this on AWS.
>>
>>
>>
>> On Fri, Sep 2, 2016 11:49 PM, kant kodali kanth909@gmail.com wrote:
>>
>>> I am running Spark in standalone mode. I get this error when I run my
>>> driver program. I am using Spark 2.0.0. Any idea what this error could be?
>>>
>>>
>>>

Re: any idea what this error could be?

Posted by Tal Grynbaum <ta...@gmail.com>.
My guess is that you're running out of memory somewhere.  Try to increase
the driver memory and/or executor memory.
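
(For reference, a sketch of how that suggestion is usually applied; the 4g values are placeholders. Executor memory can be set on the SparkConf before the context is created, while driver memory generally has to go through spark-submit's --driver-memory flag, since the driver JVM's heap is already fixed by the time the conf is read.)

    import org.apache.spark.{SparkConf, SparkContext}

    // Sketch: raise executor memory for a standalone-mode application.
    // Master URL and driver memory are assumed to be supplied via spark-submit.
    val conf = new SparkConf()
      .setAppName("memory-example")        // hypothetical app name
      .set("spark.executor.memory", "4g")  // placeholder size
    val sc = new SparkContext(conf)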

On Sat, Sep 3, 2016, 11:42 kant kodali <ka...@gmail.com> wrote:

> I am running this on AWS.
>
>
>
> On Fri, Sep 2, 2016 11:49 PM, kant kodali kanth909@gmail.com wrote:
>
>> I am running Spark in standalone mode. I get this error when I run my
>> driver program. I am using Spark 2.0.0. Any idea what this error could be?
>>
>>
>>

Re: any idea what this error could be?

Posted by kant kodali <ka...@gmail.com>.
I am running this on AWS.
 





On Fri, Sep 2, 2016 11:49 PM, kant kodali kanth909@gmail.com
wrote:
I am running Spark in standalone mode. I get this error when I run my driver
program. I am using Spark 2.0.0. Any idea what this error could be?


