Posted to user@spark.apache.org by Yiannis Gkoufas <jo...@gmail.com> on 2016/01/05 15:29:44 UTC

Networking problems in Spark 1.6.0

Hi there,

I have been using Spark 1.5.2 on my cluster without a problem and wanted to
try Spark 1.6.0. I have the exact same configuration on both clusters.
I can start the standalone cluster, but when I submit a job it fails with
errors like the following:

16/01/05 14:24:14 INFO AppClient$ClientEndpoint: Connecting to master spark://my-ip:7077...
16/01/05 14:24:34 INFO AppClient$ClientEndpoint: Connecting to master spark://my-ip:7077...
16/01/05 14:24:54 INFO AppClient$ClientEndpoint: Connecting to master spark://my-ip:7077...
16/01/05 14:24:54 INFO AppClient$ClientEndpoint: Connecting to master spark://my-ip:7077...
16/01/05 14:24:54 WARN TransportChannelHandler: Exception in connection from my-ip/X.XXX.XX.XX:7077
java.lang.NoSuchMethodError: java.util.concurrent.ConcurrentHashMap.keySet()Ljava/util/concurrent/ConcurrentHashMap$KeySetView;
    at org.apache.spark.rpc.netty.Dispatcher.postToAll(Dispatcher.scala:106)
    at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:586)
    at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:577)
    at org.apache.spark.network.server.TransportRequestHandler.processOneWayMessage(TransportRequestHandler.java:170)
    at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:104)
    at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:104)
    at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51)
    at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:86)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    at java.lang.Thread.run(Thread.java:745)

Has anyone else had similar problems?

Thanks a lot

Re: Networking problems in Spark 1.6.0

Posted by Yiannis Gkoufas <jo...@gmail.com>.
Yes, that was the case: the app was built with Java 8.
The same Java 8 build ran against Spark 1.5.2, though, and it never complained.

On 5 January 2016 at 16:40, Dean Wampler <de...@gmail.com> wrote:

> Still, it would be good to know what happened exactly. Why did the netty
> dependency expect Java 8? Did you build your app on a machine with Java 8
> and deploy on a Java 7 machine?
>
> Anyway, I played with the 1.6.0 spark-shell using Java 7 and it worked
> fine. I also looked at the distribution's class files using e.g.,
>
> $ cd $HOME/spark/spark-1.6.0-bin-hadoop2.6
> $ jar xf lib/spark-assembly-1.6.0-hadoop2.6.0.jar org/apache/spark/rpc/netty/Dispatcher.class
> $ javap -classpath . -verbose org.apache.spark.rpc.netty.Dispatcher | grep version
>   minor version: 0
>   major version: 50
>
> So, it was compiled with Java 6 (see
> https://en.wikipedia.org/wiki/Java_class_file). So, it doesn't appear to
> be a Spark build issue.
>
> dean
>
> Dean Wampler, Ph.D.
> Author: Programming Scala, 2nd Edition
> <http://shop.oreilly.com/product/0636920033073.do> (O'Reilly)
> Typesafe <http://typesafe.com>
> @deanwampler <http://twitter.com/deanwampler>
> http://polyglotprogramming.com
>
> On Tue, Jan 5, 2016 at 9:01 AM, Yiannis Gkoufas <jo...@gmail.com>
> wrote:
>
>> Hi Dean,
>>
>> thanks so much for the response! It works without a problem now!
>>
>> On 5 January 2016 at 14:33, Dean Wampler <de...@gmail.com> wrote:
>>
>>> ConcurrentHashMap.keySet() returning a KeySetView is a Java 8 method.
>>> The Java 7 method returns a Set. Are you running Java 7? What happens if
>>> you run Java 8?
>>>
>>> Dean Wampler, Ph.D.
>>> Author: Programming Scala, 2nd Edition
>>> <http://shop.oreilly.com/product/0636920033073.do> (O'Reilly)
>>> Typesafe <http://typesafe.com>
>>> @deanwampler <http://twitter.com/deanwampler>
>>> http://polyglotprogramming.com
>>>
>>> On Tue, Jan 5, 2016 at 8:29 AM, Yiannis Gkoufas <jo...@gmail.com>
>>> wrote:
>>>
>>>> Hi there,
>>>>
>>>> I have been using Spark 1.5.2 on my cluster without a problem and
>>>> wanted to try Spark 1.6.0.
>>>> I have the exact same configuration on both clusters.
>>>> I am able to start the Standalone Cluster but I fail to submit a job
>>>> getting errors like the following:
>>>>
>>>> [stack trace elided]
>>>>
>>>> Has anyone else had similar problems?
>>>>
>>>> Thanks a lot
>>>>
>>>>
>>>
>>
>

Re: Networking problems in Spark 1.6.0

Posted by Dean Wampler <de...@gmail.com>.
Still, it would be good to know what happened exactly. Why did the netty
dependency expect Java 8? Did you build your app on a machine with Java 8
and deploy on a Java 7 machine?

Anyway, I played with the 1.6.0 spark-shell using Java 7 and it worked
fine. I also looked at the distribution's class files using e.g.,

$ cd $HOME/spark/spark-1.6.0-bin-hadoop2.6
$ jar xf lib/spark-assembly-1.6.0-hadoop2.6.0.jar org/apache/spark/rpc/netty/Dispatcher.class
$ javap -classpath . -verbose org.apache.spark.rpc.netty.Dispatcher | grep version
  minor version: 0
  major version: 50

So it was compiled for the Java 6 class format (see
https://en.wikipedia.org/wiki/Java_class_file), which means this doesn't
appear to be a Spark build issue.
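For the archives, the javap check above can also be scripted: the first eight bytes of any .class file are the 0xCAFEBABE magic number followed by the minor and major version numbers (major 50 = Java 6, 51 = Java 7, 52 = Java 8). A minimal sketch in Python; the version table only covers the releases relevant to this thread, and the file name in the usage comment is just an example:

```python
import struct

# Class-file major versions for the Java releases discussed in this thread.
MAJOR_TO_RELEASE = {50: "Java 6", 51: "Java 7", 52: "Java 8"}

def java_release_of(class_bytes: bytes) -> str:
    """Return the Java release a compiled .class file targets."""
    # Header layout: u4 magic, u2 minor_version, u2 major_version (big-endian).
    magic, minor, major = struct.unpack(">IHH", class_bytes[:8])
    if magic != 0xCAFEBABE:
        raise ValueError("not a Java class file")
    return MAJOR_TO_RELEASE.get(major, "major version %d" % major)

# Usage, after extracting a class from the assembly jar as shown above:
#   java_release_of(open("org/apache/spark/rpc/netty/Dispatcher.class", "rb").read())
```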

dean

Dean Wampler, Ph.D.
Author: Programming Scala, 2nd Edition
<http://shop.oreilly.com/product/0636920033073.do> (O'Reilly)
Typesafe <http://typesafe.com>
@deanwampler <http://twitter.com/deanwampler>
http://polyglotprogramming.com

On Tue, Jan 5, 2016 at 9:01 AM, Yiannis Gkoufas <jo...@gmail.com>
wrote:

> Hi Dean,
>
> thanks so much for the response! It works without a problem now!
>
> On 5 January 2016 at 14:33, Dean Wampler <de...@gmail.com> wrote:
>
>> ConcurrentHashMap.keySet() returning a KeySetView is a Java 8 method. The
>> Java 7 method returns a Set. Are you running Java 7? What happens if you
>> run Java 8?
>>
>> Dean Wampler, Ph.D.
>> Author: Programming Scala, 2nd Edition
>> <http://shop.oreilly.com/product/0636920033073.do> (O'Reilly)
>> Typesafe <http://typesafe.com>
>> @deanwampler <http://twitter.com/deanwampler>
>> http://polyglotprogramming.com
>>
>> On Tue, Jan 5, 2016 at 8:29 AM, Yiannis Gkoufas <jo...@gmail.com>
>> wrote:
>>
>>> Hi there,
>>>
>>> I have been using Spark 1.5.2 on my cluster without a problem and wanted
>>> to try Spark 1.6.0.
>>> I have the exact same configuration on both clusters.
>>> I am able to start the Standalone Cluster but I fail to submit a job
>>> getting errors like the following:
>>>
>>> [stack trace elided]
>>>
>>> Has anyone else had similar problems?
>>>
>>> Thanks a lot
>>>
>>>
>>
>

Re: Networking problems in Spark 1.6.0

Posted by Yiannis Gkoufas <jo...@gmail.com>.
Hi Dean,

thanks so much for the response! It works without a problem now!

On 5 January 2016 at 14:33, Dean Wampler <de...@gmail.com> wrote:

> ConcurrentHashMap.keySet() returning a KeySetView is a Java 8 method. The
> Java 7 method returns a Set. Are you running Java 7? What happens if you
> run Java 8?
>
> Dean Wampler, Ph.D.
> Author: Programming Scala, 2nd Edition
> <http://shop.oreilly.com/product/0636920033073.do> (O'Reilly)
> Typesafe <http://typesafe.com>
> @deanwampler <http://twitter.com/deanwampler>
> http://polyglotprogramming.com
>
> On Tue, Jan 5, 2016 at 8:29 AM, Yiannis Gkoufas <jo...@gmail.com>
> wrote:
>
>> Hi there,
>>
>> I have been using Spark 1.5.2 on my cluster without a problem and wanted
>> to try Spark 1.6.0.
>> I have the exact same configuration on both clusters.
>> I am able to start the Standalone Cluster but I fail to submit a job
>> getting errors like the following:
>>
>> [stack trace elided]
>>
>> Has anyone else had similar problems?
>>
>> Thanks a lot
>>
>>
>

Re: Networking problems in Spark 1.6.0

Posted by Dean Wampler <de...@gmail.com>.
ConcurrentHashMap.keySet() returning a KeySetView is a Java 8 signature; in
Java 7, keySet() returns a plain Set. Are you running Java 7? What happens if
you run Java 8?
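The mismatched return type is visible right in the NoSuchMethodError message from the original post, which quotes the JVM method descriptor verbatim. Decoding such a descriptor is mechanical; a small illustrative sketch (the helper name is mine, not a JDK API, and only a few primitive type codes are covered):

```python
def jvm_return_type(signature: str) -> str:
    """Decode the return type from a JVM method signature, e.g.
    'keySet()Ljava/util/concurrent/ConcurrentHashMap$KeySetView;'."""
    # Everything after the ')' is the return-type descriptor.
    ret = signature.split(")", 1)[1]
    if ret.startswith("L") and ret.endswith(";"):
        # Object type: strip the L...; wrapper and turn '/' into '.'.
        return ret[1:-1].replace("/", ".")
    # A few primitive type codes (JVM spec, section 4.3).
    return {"V": "void", "Z": "boolean", "I": "int", "J": "long"}.get(ret, ret)

# jvm_return_type("keySet()Ljava/util/concurrent/ConcurrentHashMap$KeySetView;")
#   -> "java.util.concurrent.ConcurrentHashMap$KeySetView"
```

So the calling class was compiled against the Java 8 signature, and a Java 7 runtime, whose keySet() descriptor ends in Ljava/util/Set;, has no method matching it.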

Dean Wampler, Ph.D.
Author: Programming Scala, 2nd Edition
<http://shop.oreilly.com/product/0636920033073.do> (O'Reilly)
Typesafe <http://typesafe.com>
@deanwampler <http://twitter.com/deanwampler>
http://polyglotprogramming.com

On Tue, Jan 5, 2016 at 8:29 AM, Yiannis Gkoufas <jo...@gmail.com>
wrote:

> Hi there,
>
> I have been using Spark 1.5.2 on my cluster without a problem and wanted
> to try Spark 1.6.0.
> I have the exact same configuration on both clusters.
> I am able to start the Standalone Cluster but I fail to submit a job
> getting errors like the following:
>
> [stack trace elided]
>
> Has anyone else had similar problems?
>
> Thanks a lot
>
>