Posted to user@spark.apache.org by Suman Somasundar <su...@oracle.com> on 2017/10/09 22:42:37 UTC

UnresolvedAddressException in Kubernetes Cluster

Hi,

I am trying to deploy a Spark app in a Kubernetes cluster. The cluster consists of two machines, one master and one slave, each with the following configuration:
RHEL 7.2
Docker 17.03.1
Kubernetes 1.7

I am following the steps provided in https://apache-spark-on-k8s.github.io/userdocs/running-on-kubernetes.html

When I submit an application (SparkPi), a driver pod is created on the slave machine of the cluster, but it exits with the following exception:

2017-10-09 22:13:24 INFO  SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
2017-10-09 22:13:30 ERROR SparkContext:91 - Error initializing SparkContext.
java.nio.channels.UnresolvedAddressException
	at sun.nio.ch.Net.checkAddress(Net.java:101)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:218)
	at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:127)
	at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:501)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1218)
	at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:496)
	at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:481)
	at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:965)
	at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:210)
	at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:353)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:399)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:446)
	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
	at java.lang.Thread.run(Thread.java:748)
2017-10-09 22:13:30 INFO  SparkContext:54 - Successfully stopped SparkContext
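For context on the trace above: the throwing frame is sun.nio.ch.Net.checkAddress, which raises UnresolvedAddressException when the socket address handed to bind() was never resolved to an IP, i.e. the hostname Netty is trying to bind the driver's server socket to cannot be looked up from inside the pod. A minimal sketch of that failure mode (the hostname and port are placeholders, not values from this cluster):

```java
import java.net.InetSocketAddress;
import java.nio.channels.ServerSocketChannel;
import java.nio.channels.UnresolvedAddressException;

public class UnresolvedBindDemo {
    public static void main(String[] args) throws Exception {
        // createUnresolved skips the DNS lookup, leaving the address in the
        // same unresolved state as a hostname that fails to resolve in-pod.
        InetSocketAddress addr =
                InetSocketAddress.createUnresolved("no-such-host.invalid", 7078);
        try (ServerSocketChannel ch = ServerSocketChannel.open()) {
            // Same code path as the stack trace: bind -> Net.checkAddress
            ch.bind(addr);
            System.out.println("bind succeeded (unexpected)");
        } catch (UnresolvedAddressException e) {
            System.out.println("UnresolvedAddressException");
        }
    }
}
// → prints UnresolvedAddressException
```

If this is what is happening in the driver pod, it would suggest checking that the value Spark picked for spark.driver.host / spark.driver.bindAddress is actually resolvable from inside the pod (for example via cluster DNS).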

Has anyone come across this problem, or does anyone know why it might be happening?

Thanks,
Suman.

Re: UnresolvedAddressException in Kubernetes Cluster

Posted by Matt Cheah <mc...@palantir.com>.
Hi there,



This closely resembles https://github.com/apache-spark-on-k8s/spark/issues/523, and we’re having some discussion there to find some possible root causes. However, what release of the fork are you working off of? Are you using the HEAD of branch-2.2-kubernetes, or something else?



-Matt Cheah

________________________________
From: Suman Somasundar <su...@oracle.com>
Sent: Monday, October 9, 2017 3:42:37 PM
To: user@spark.apache.org
Subject: UnresolvedAddressException in Kubernetes Cluster

Hi,

I am trying to deploy a Spark app in a Kubernetes Cluster. The cluster consists of 2 machines - 1 master and 1 slave, each of them with the following config:
RHEL 7.2
Docker 17.03.1
K8S 1.7.

I am following the steps provided in https://apache-spark-on-k8s.github.io/userdocs/running-on-kubernetes.html[apache-spark-on-k8s.github.io]<https://urldefense.proofpoint.com/v2/url?u=https-3A__apache-2Dspark-2Don-2Dk8s.github.io_userdocs_running-2Don-2Dkubernetes.html&d=DwMFAg&c=izlc9mHr637UR4lpLEZLFFS3Vn2UXBrZ4tFb6oOnmz8&r=hzwIMNQ9E99EMYGuqHI0kXhVbvX3nU3OSDadUnJxjAs&m=zlu5rpKoNz_UVPQO7fiOZsC4zDcywhTcymQFzm4fwtI&s=oc2RZklavV0xU4gLilQBN3DocwjpwddYI3TNgL7CBhk&e=>

When I submit an application (SparkPi), a driver pod is created on the slave machine of the cluster. But it exits with an exception:

2017-10-09 22:13:24 INFO  SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
2017-10-09 22:13:30 ERROR SparkContext:91 - Error initializing SparkContext.
java.nio.channels.UnresolvedAddressException
at sun.nio.ch.Net.checkAddress(Net.java:101)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:218)
at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:127)
at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:501)
at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1218)
at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:496)
at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:481)
at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:965)
at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:210)
at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:353)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:399)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:446)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
at java.lang.Thread.run(Thread.java:748)
2017-10-09 22:13:30 INFO  SparkContext:54 - Successfully stopped SparkContext

Has anyone come across this problem or know why this might be happening?

Thanks,
Suman.