Posted to user@spark.apache.org by Taeyun Kim <ta...@innowireless.com> on 2014/12/11 08:37:09 UTC

Error on JavaSparkContext.stop()

Hi,

 

When my Spark program calls JavaSparkContext.stop(), the following errors occur
(a simplified sketch of the call site follows the log):

           

    14/12/11 16:24:19 INFO Main: sc.stop {
    14/12/11 16:24:20 ERROR ConnectionManager: Corresponding SendingConnection to ConnectionManagerId(cluster02,38918) not found
    14/12/11 16:24:20 ERROR SendingConnection: Exception while reading SendingConnection to ConnectionManagerId(cluster04,59659)
    java.nio.channels.ClosedChannelException
            at sun.nio.ch.SocketChannelImpl.ensureReadOpen(SocketChannelImpl.java:252)
            at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:295)
            at org.apache.spark.network.SendingConnection.read(Connection.scala:390)
            at org.apache.spark.network.ConnectionManager$$anon$6.run(ConnectionManager.scala:205)
            at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
            at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
            at java.lang.Thread.run(Thread.java:745)
    14/12/11 16:24:20 ERROR ConnectionManager: Corresponding SendingConnection to ConnectionManagerId(cluster03,59821) not found
    14/12/11 16:24:20 ERROR ConnectionManager: Corresponding SendingConnection to ConnectionManagerId(cluster02,38918) not found
    14/12/11 16:24:20 WARN ConnectionManager: All connections not cleaned up
    14/12/11 16:24:20 INFO Main: sc.stop }
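
For reference, the shutdown path in my driver looks roughly like this (a simplified
sketch; the "Main" class name and the brace-style log lines match the log above,
and log4j is assumed for the logging):

    // Simplified sketch of the shutdown path (illustrative only).
    import org.apache.log4j.Logger;
    import org.apache.spark.api.java.JavaSparkContext;

    public class Main {
        private static final Logger logger = Logger.getLogger(Main.class);

        static void shutdown(JavaSparkContext sc) {
            logger.info("sc.stop {");
            sc.stop();  // the ConnectionManager errors above appear during this call
            logger.info("sc.stop }");
        }
    }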

 

How can I fix this?

 

The configuration is as follows:

- Spark version is 1.1.1

- The client runs on Windows 7

- The cluster runs Linux (CentOS 6.5).

- spark.master=yarn-client (a sketch of how the context is configured in code follows this list)

- Since Spark has a problem submitting a job from Windows to Linux, I applied my
patch to the Spark source code. (Please see
https://github.com/apache/spark/pull/899 )
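
Roughly, the context is created on the Windows client like this (a simplified
sketch; the class name and application name are illustrative):

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class ContextFactory {
        static JavaSparkContext create() {
            // spark.master=yarn-client, as listed above; the app name is illustrative.
            SparkConf conf = new SparkConf()
                    .setMaster("yarn-client")
                    .setAppName("MyApp");
            return new JavaSparkContext(conf);
        }
    }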

 

Spark 1.0.0 did not have this problem.

 

Thanks.