Posted to user@spark.apache.org by moon soo Lee <mo...@nflabs.com> on 2014/09/01 10:46:45 UTC
Spark driver application can not connect to Spark-Master
Hi, I'm developing an application with Spark.
My Java application tries to create a SparkContext like this:
-------- Creating spark context ------------
public SparkContext createSparkContext() {
    String execUri = System.getenv("SPARK_EXECUTOR_URI");
    String[] jars = SparkILoop.getAddedJars();
    SparkConf conf = new SparkConf().setMaster(getMaster())
        .setAppName("App name").setJars(jars)
        .set("spark.repl.class.uri", interpreter.intp().classServer().uri());
    if (execUri != null) {
        conf.set("spark.executor.uri", execUri);
    }
    if (System.getenv("SPARK_HOME") != null) {
        conf.setSparkHome(System.getenv("SPARK_HOME"));
    }
    SparkContext sparkContext = new SparkContext(conf);
    return sparkContext;
}

public String getMaster() {
    String envMaster = System.getenv().get("MASTER");
    if (envMaster != null) return envMaster;
    String propMaster = System.getProperty("spark.master");
    if (propMaster != null) return propMaster;
    return "local[*]";
}
But when I call createSparkContext(), on the driver side I get logs like this:
---------- My application's log -----------------
INFO [2014-09-01 17:28:37,092] ({pool-1-thread-2}
Logging.scala[logInfo]:58) - Changing view acls to: root
INFO [2014-09-01 17:28:37,092] ({pool-1-thread-2}
Logging.scala[logInfo]:58) - SecurityManager: authentication disabled; ui
acls disabled; users with view permissions: Set(root)
INFO [2014-09-01 17:28:37,093] ({pool-1-thread-2}
Logging.scala[logInfo]:58) - Starting HTTP Server
INFO [2014-09-01 17:28:37,096] ({pool-1-thread-2}
Server.java[doStart]:272) - jetty-8.1.14.v20131031
INFO [2014-09-01 17:28:37,099] ({pool-1-thread-2}
AbstractConnector.java[doStart]:338) - Started SocketConnector@0.0.0.0:46610
INFO [2014-09-01 17:28:40,050] ({pool-1-thread-2}
Logging.scala[logInfo]:58) - Changing view acls to: root
INFO [2014-09-01 17:28:40,050] ({pool-1-thread-2}
Logging.scala[logInfo]:58) - SecurityManager: authentication disabled; ui
acls disabled; users with view permissions: Set(root)
INFO [2014-09-01 17:28:40,589] ({spark-akka.actor.default-dispatcher-2}
Slf4jLogger.scala[applyOrElse]:80) - Slf4jLogger started
INFO [2014-09-01 17:28:40,626] ({spark-akka.actor.default-dispatcher-2}
Slf4jLogger.scala[apply$mcV$sp]:74) - Starting remoting
INFO [2014-09-01 17:28:40,833] ({spark-akka.actor.default-dispatcher-3}
Slf4jLogger.scala[apply$mcV$sp]:74) - Remoting started; listening on
addresses :[akka.tcp://spark@222.122.122.122:46833]
INFO [2014-09-01 17:28:40,835] ({spark-akka.actor.default-dispatcher-4}
Slf4jLogger.scala[apply$mcV$sp]:74) - Remoting now listens on addresses:
[akka.tcp://spark@222.122.122.122:46833]
INFO [2014-09-01 17:28:40,858] ({pool-1-thread-2}
Logging.scala[logInfo]:58) - Registering MapOutputTracker
INFO [2014-09-01 17:28:40,861] ({pool-1-thread-2}
Logging.scala[logInfo]:58) - Registering BlockManagerMaster
INFO [2014-09-01 17:28:40,877] ({pool-1-thread-2}
Logging.scala[logInfo]:58) - Created local directory at
/tmp/spark-local-20140901172840-baf4
INFO [2014-09-01 17:28:40,881] ({pool-1-thread-2}
Logging.scala[logInfo]:58) - MemoryStore started with capacity 546.3 MB.
INFO [2014-09-01 17:28:40,912] ({pool-1-thread-2}
Logging.scala[logInfo]:58) - Bound socket to port 42671 with id =
ConnectionManagerId(222.122.122.122,42671)
INFO [2014-09-01 17:28:40,917] ({pool-1-thread-2}
Logging.scala[logInfo]:58) - Trying to register BlockManager
INFO [2014-09-01 17:28:40,920] ({spark-akka.actor.default-dispatcher-4}
Logging.scala[logInfo]:58) - Registering block manager 222.122.122.122:42671
with 546.3 MB RAM
INFO [2014-09-01 17:28:40,921] ({pool-1-thread-2}
Logging.scala[logInfo]:58) - Registered BlockManager
INFO [2014-09-01 17:28:40,932] ({pool-1-thread-2}
Logging.scala[logInfo]:58) - Starting HTTP Server
INFO [2014-09-01 17:28:40,933] ({pool-1-thread-2}
Server.java[doStart]:272) - jetty-8.1.14.v20131031
INFO [2014-09-01 17:28:40,935] ({pool-1-thread-2}
AbstractConnector.java[doStart]:338) - Started SocketConnector@0.0.0.0:52020
INFO [2014-09-01 17:28:40,936] ({pool-1-thread-2}
Logging.scala[logInfo]:58) - Broadcast server started at
http://222.122.122.122:52020
INFO [2014-09-01 17:28:40,943] ({pool-1-thread-2}
Logging.scala[logInfo]:58) - HTTP File server directory is
/tmp/spark-fc4cc226-c740-4cec-ad0f-6f88762d365c
INFO [2014-09-01 17:28:40,943] ({pool-1-thread-2}
Logging.scala[logInfo]:58) - Starting HTTP Server
INFO [2014-09-01 17:28:40,944] ({pool-1-thread-2}
Server.java[doStart]:272) - jetty-8.1.14.v20131031
INFO [2014-09-01 17:28:40,946] ({pool-1-thread-2}
AbstractConnector.java[doStart]:338) - Started SocketConnector@0.0.0.0:59458
INFO [2014-09-01 17:28:41,167] ({pool-1-thread-2}
Server.java[doStart]:272) - jetty-8.1.14.v20131031
INFO [2014-09-01 17:28:41,177] ({pool-1-thread-2}
AbstractConnector.java[doStart]:338) - Started
SelectChannelConnector@0.0.0.0:4040
INFO [2014-09-01 17:28:41,180] ({pool-1-thread-2}
Logging.scala[logInfo]:58) - Started SparkUI at http://222.122.122.122:4040
INFO [2014-09-01 17:28:41,410] ({spark-akka.actor.default-dispatcher-3}
Logging.scala[logInfo]:58) - Connecting to master
spark://master.spark.com:7077...
INFO [2014-09-01 17:29:01,441] ({spark-akka.actor.default-dispatcher-4}
Logging.scala[logInfo]:58) - Connecting to master
spark://master.spark.com:7077...
INFO [2014-09-01 17:29:21,440] ({spark-akka.actor.default-dispatcher-5}
Logging.scala[logInfo]:58) - Connecting to master
spark://master.spark.com:7077...
ERROR [2014-09-01 17:29:41,441] ({spark-akka.actor.default-dispatcher-3}
Logging.scala[logError]:74) - Application has been killed. Reason: All
masters are unresponsive! Giving up.
ERROR [2014-09-01 17:29:41,441] ({spark-akka.actor.default-dispatcher-3}
Logging.scala[logError]:74) - Exiting due to error from cluster scheduler:
All masters are unresponsive! Giving up.
and the Spark master prints logs like this:
--------- Spark master log -------------------
14/09/01 17:29:01 INFO Master: akka.tcp://spark@222.122.122.122:46833 got
disassociated, removing it.
14/09/01 17:29:01 INFO LocalActorRef: Message
[akka.remote.transport.AssociationHandle$Disassociated] from
Actor[akka://sparkMaster/deadLetters] to
Actor[akka://sparkMaster/system/endpointManager/endpointWriter-akka.tcp%3A%2F%2Fspark%40222.122.122.122%3A46833-89/endpointReader-akka.tcp%3A%2F%2Fspark%40222.122.122.122%3A46833-0#1195166754]
was not delivered. [163] dead letters encountered. This logging can be
turned off or adjusted with configuration settings 'akka.log-dead-letters'
and 'akka.log-dead-letters-during-shutdown'.
14/09/01 17:29:01 INFO LocalActorRef: Message
[akka.remote.transport.AssociationHandle$Disassociated] from
Actor[akka://sparkMaster/deadLetters] to
Actor[akka://sparkMaster/system/transports/akkaprotocolmanager.tcp0/akkaProtocol-tcp%3A%2F%2FsparkMaster%40222.122.122.122%3A48188-183#-552318082]
was not delivered. [164] dead letters encountered. This logging can be
turned off or adjusted with configuration settings 'akka.log-dead-letters'
and 'akka.log-dead-letters-during-shutdown'.
14/09/01 17:29:21 ERROR Remoting:
java.io.OptionalDataException
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1370)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at scala.collection.immutable.$colon$colon.readObject(List.scala:366)
at sun.reflect.GeneratedMethodAccessor22.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at scala.collection.immutable.$colon$colon.readObject(List.scala:366)
at sun.reflect.GeneratedMethodAccessor22.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at scala.collection.immutable.$colon$colon.readObject(List.scala:366)
at sun.reflect.GeneratedMethodAccessor22.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at scala.collection.immutable.$colon$colon.readObject(List.scala:366)
at sun.reflect.GeneratedMethodAccessor22.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at scala.collection.immutable.$colon$colon.readObject(List.scala:366)
at sun.reflect.GeneratedMethodAccessor22.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at akka.serialization.JavaSerializer$$anonfun$1.apply(Serializer.scala:136)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
at akka.serialization.JavaSerializer.fromBinary(Serializer.scala:136)
at akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104)
at scala.util.Try$.apply(Try.scala:161)
at akka.serialization.Serialization.deserialize(Serialization.scala:98)
at akka.remote.serialization.MessageContainerSerializer.fromBinary(MessageContainerSerializer.scala:58)
at akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104)
at scala.util.Try$.apply(Try.scala:161)
at akka.serialization.Serialization.deserialize(Serialization.scala:98)
at akka.remote.MessageSerializer$.deserialize(MessageSerializer.scala:23)
at akka.remote.DefaultMessageDispatcher.payload$lzycompute$1(Endpoint.scala:55)
at akka.remote.DefaultMessageDispatcher.payload$1(Endpoint.scala:55)
at akka.remote.DefaultMessageDispatcher.dispatch(Endpoint.scala:73)
at akka.remote.EndpointReader$$anonfun$receive$2.applyOrElse(Endpoint.scala:764)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)
at akka.actor.ActorCell.invoke(ActorCell.scala:456)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)
at akka.dispatch.Mailbox.run(Mailbox.scala:219)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
14/09/01 17:29:21 INFO Master: akka.tcp://spark@222.122.122.122:46833 got
disassociated, removing it.
14/09/01 17:29:21 INFO LocalActorRef: Message
[akka.remote.transport.AssociationHandle$Disassociated] from
Actor[akka://sparkMaster/deadLetters] to
Actor[akka://sparkMaster/system/endpointManager/endpointWriter-akka.tcp%3A%2F%2Fspark%40222.122.122.122%3A46833-90/endpointReader-akka.tcp%3A%2F%2Fspark%40222.122.122.122%3A46833-0#-1619151630]
was not delivered. [165] dead letters encountered. This logging can be
turned off or adjusted with configuration settings 'akka.log-dead-letters'
and 'akka.log-dead-letters-during-shutdown'.
14/09/01 17:29:21 INFO LocalActorRef: Message
[akka.remote.transport.AssociationHandle$Disassociated] from
Actor[akka://sparkMaster/deadLetters] to
Actor[akka://sparkMaster/system/transports/akkaprotocolmanager.tcp0/akkaProtocol-tcp%3A%2F%2FsparkMaster%40222.122.122.122%3A48189-184#387676530]
was not delivered. [166] dead letters encountered. This logging can be
turned off or adjusted with configuration settings 'akka.log-dead-letters'
and 'akka.log-dead-letters-during-shutdown'.
14/09/01 17:29:41 INFO Master: akka.tcp://spark@222.122.122.122:46833 got
disassociated, removing it.
14/09/01 17:29:41 INFO Master: akka.tcp://spark@222.122.122.122:46833 got
disassociated, removing it.
14/09/01 17:29:41 INFO Master: akka.tcp://spark@222.122.122.122:46833 got
disassociated, removing it.
14/09/01 17:29:41 INFO Master: akka.tcp://spark@222.122.122.122:46833 got
disassociated, removing it.
14/09/01 17:29:41 INFO LocalActorRef: Message
[akka.remote.transport.ActorTransportAdapter$DisassociateUnderlying] from
Actor[akka://sparkMaster/deadLetters] to
Actor[akka://sparkMaster/system/transports/akkaprotocolmanager.tcp0/akkaProtocol-tcp%3A%2F%2FsparkMaster%40222.122.122.122%3A48190-185#772674183]
was not delivered. [167] dead letters encountered. This logging can be
turned off or adjusted with configuration settings 'akka.log-dead-letters'
and 'akka.log-dead-letters-during-shutdown'.
14/09/01 17:29:41 INFO LocalActorRef: Message
[akka.remote.transport.ActorTransportAdapter$DisassociateUnderlying] from
Actor[akka://sparkMaster/deadLetters] to
Actor[akka://sparkMaster/system/transports/akkaprotocolmanager.tcp0/akkaProtocol-tcp%3A%2F%2Fspark%40222.122.122.122%3A46833-182#293009171]
was not delivered. [168] dead letters encountered. This logging can be
turned off or adjusted with configuration settings 'akka.log-dead-letters'
and 'akka.log-dead-letters-during-shutdown'.
14/09/01 17:29:41 ERROR EndpointWriter: AssociationError [akka.tcp://sparkMaster@master.spark.com:7077] -> [akka.tcp://spark@222.122.122.122:46833]: Error [Association failed with [akka.tcp://spark@222.122.122.122:46833]] [
akka.remote.EndpointAssociationException: Association failed with [akka.tcp://spark@222.122.122.122:46833]
Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: /222.122.122.122:46833
]
14/09/01 17:29:41 INFO Master: akka.tcp://spark@222.122.122.122:46833 got disassociated, removing it.
14/09/01 17:29:41 ERROR EndpointWriter: AssociationError [akka.tcp://sparkMaster@master.spark.com:7077] -> [akka.tcp://spark@222.122.122.122:46833]: Error [Association failed with [akka.tcp://spark@222.122.122.122:46833]] [
akka.remote.EndpointAssociationException: Association failed with [akka.tcp://spark@222.122.122.122:46833]
Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: /222.122.122.122:46833
]
14/09/01 17:29:41 INFO Master: akka.tcp://spark@222.122.122.122:46833 got disassociated, removing it.
14/09/01 17:29:41 ERROR EndpointWriter: AssociationError [akka.tcp://sparkMaster@master.spark.com:7077] -> [akka.tcp://spark@222.122.122.122:46833]: Error [Association failed with [akka.tcp://spark@222.122.122.122:46833]] [
akka.remote.EndpointAssociationException: Association failed with [akka.tcp://spark@222.122.122.122:46833]
Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: /222.122.122.122:46833
]
14/09/01 17:29:41 INFO Master: akka.tcp://spark@222.122.122.122:46833 got disassociated, removing it.
-------------
My application and the Spark master are running on the same node,
and the MASTER env variable is set to spark://master.spark.com:7077.
I'm using Spark 1.0.1.
I verified that my cluster works smoothly with spark-shell.
Can someone give me a clue how I can fix this problem?
Thanks.
Re: Spark driver application can not connect to Spark-Master
Posted by niranda <ni...@wso2.com>.
Hi,
I had the same issue in my Java code while trying to connect to a
locally hosted Spark server (started with sbin/start-all.sh etc.) from an
IDE (IntelliJ).
I packaged my app into a jar and used spark-submit (in bin/), and it worked!
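For anyone following this route, a typical spark-submit invocation against a standalone master looks like the following sketch; the class name and jar path are placeholders for your own application:

$SPARK_HOME/bin/spark-submit \
  --class com.example.MyApp \
  --master spark://master.spark.com:7077 \
  /path/to/my-app.jar

spark-submit sets up the driver classpath and Spark properties for you, which avoids several of the pitfalls of launching a driver directly from an IDE.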
Hope this helps
Rgds
Re: Spark driver application can not connect to Spark-Master
Posted by Andrew Or <an...@databricks.com>.
Your Master is dead, and your application can't connect to it. Can you
verify whether it was your application that killed the Master (by checking
the Master logs before and after you submit your application)? Try
restarting your master (and workers) through `sbin/stop-all.sh` and
`sbin/start-all.sh` on the master node, and try again. Also verify that
your driver is connecting to the correct address, which should be equal to
the value at the top-left corner of the Master web UI
(http://<master-ip>:8080).
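One quick way to rule out basic connectivity problems from the driver host is a plain TCP probe of the master port before creating the SparkContext. A minimal sketch (the MasterProbe class and isReachable helper are illustrative, not part of Spark's API; the host and port are the ones from this thread):

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class MasterProbe {
    // Returns true if a plain TCP connection to host:port succeeds within
    // timeoutMs; a failure here means the master (or the route to it) is
    // unreachable before Spark/Akka is even involved.
    public static boolean isReachable(String host, int port, int timeoutMs) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        if (!isReachable("master.spark.com", 7077, 5000)) {
            System.err.println("Master port not reachable; check host, port and firewall");
        }
    }
}

Note that a successful probe only shows the port is open; the master also needs to connect back to the driver's random Akka port (46833 in your log), so firewalls between the two hosts matter in both directions.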
-Andrew
2014-09-01 1:46 GMT-07:00 moon soo Lee <mo...@nflabs.com>:
> Hi, I'm developing an application with Spark.
>
> My java application trying to creates spark context like
>
>
> -------- Creating spark context ------------
>
> public SparkContext createSparkContext(){
> String execUri = System.getenv("SPARK_EXECUTOR_URI");
> String[] jars = SparkILoop.getAddedJars();
> SparkConf conf = new SparkConf().setMaster(getMaster())
> .setAppName("App name").setJars(jars)
> .set("spark.repl.class.uri", interpreter.intp().classServer().uri());
> if (execUri != null) {
> conf.set("spark.executor.uri", execUri);
> }
> if (System.getenv("SPARK_HOME") != null) {
> conf.setSparkHome(System.getenv("SPARK_HOME"));
> }
> SparkContext sparkContext = new SparkContext(conf);
> return sparkContext;
> }
> public String getMaster() {
> String envMaster = System.getenv().get("MASTER");
> if(envMaster!=null) return envMaster;
> String propMaster = System.getProperty("spark.master");
> if(propMaster!=null) return propMaster;
> return "local[*]";
> }
>
>
> But when i call createSparkContext(), in driver side, i got logs like
>
>
> ---------- My application's log -----------------
> INFO [2014-09-01 17:28:37,092] ({pool-1-thread-2}
> Logging.scala[logInfo]:58) - Changing view acls to: root
> INFO [2014-09-01 17:28:37,092] ({pool-1-thread-2}
> Logging.scala[logInfo]:58) - SecurityManager: authentication disabled; ui
> acls disabled; users with view permissions: Set(root)
> INFO [2014-09-01 17:28:37,093] ({pool-1-thread-2}
> Logging.scala[logInfo]:58) - Starting HTTP Server
> INFO [2014-09-01 17:28:37,096] ({pool-1-thread-2}
> Server.java[doStart]:272) - jetty-8.1.14.v20131031
> INFO [2014-09-01 17:28:37,099] ({pool-1-thread-2}
> AbstractConnector.java[doStart]:338) - Started
> SocketConnector@0.0.0.0:46610
> INFO [2014-09-01 17:28:40,050] ({pool-1-thread-2}
> Logging.scala[logInfo]:58) - Changing view acls to: root
> INFO [2014-09-01 17:28:40,050] ({pool-1-thread-2}
> Logging.scala[logInfo]:58) - SecurityManager: authentication disabled; ui
> acls disabled; users with view permissions: Set(root)
> INFO [2014-09-01 17:28:40,589] ({spark-akka.actor.default-dispatcher-2}
> Slf4jLogger.scala[applyOrElse]:80) - Slf4jLogger started
> INFO [2014-09-01 17:28:40,626] ({spark-akka.actor.default-dispatcher-2}
> Slf4jLogger.scala[apply$mcV$sp]:74) - Starting remoting
> INFO [2014-09-01 17:28:40,833] ({spark-akka.actor.default-dispatcher-3}
> Slf4jLogger.scala[apply$mcV$sp]:74) - Remoting started; listening on
> addresses :[akka.tcp://spark@222.122.122.122:46833]
> INFO [2014-09-01 17:28:40,835] ({spark-akka.actor.default-dispatcher-4}
> Slf4jLogger.scala[apply$mcV$sp]:74) - Remoting now listens on addresses:
> [akka.tcp://spark@222.122.122.122:46833]
> INFO [2014-09-01 17:28:40,858] ({pool-1-thread-2}
> Logging.scala[logInfo]:58) - Registering MapOutputTracker
> INFO [2014-09-01 17:28:40,861] ({pool-1-thread-2}
> Logging.scala[logInfo]:58) - Registering BlockManagerMaster
> INFO [2014-09-01 17:28:40,877] ({pool-1-thread-2}
> Logging.scala[logInfo]:58) - Created local directory at
> /tmp/spark-local-20140901172840-baf4
> INFO [2014-09-01 17:28:40,881] ({pool-1-thread-2}
> Logging.scala[logInfo]:58) - MemoryStore started with capacity 546.3 MB.
> INFO [2014-09-01 17:28:40,912] ({pool-1-thread-2}
> Logging.scala[logInfo]:58) - Bound socket to port 42671 with id =
> ConnectionManagerId(222.122.122.122,42671)
> INFO [2014-09-01 17:28:40,917] ({pool-1-thread-2}
> Logging.scala[logInfo]:58) - Trying to register BlockManager
> INFO [2014-09-01 17:28:40,920] ({spark-akka.actor.default-dispatcher-4}
> Logging.scala[logInfo]:58) - Registering block manager
> 222.122.122.122:42671 with 546.3 MB RAM
> INFO [2014-09-01 17:28:40,921] ({pool-1-thread-2}
> Logging.scala[logInfo]:58) - Registered BlockManager
> INFO [2014-09-01 17:28:40,932] ({pool-1-thread-2}
> Logging.scala[logInfo]:58) - Starting HTTP Server
> INFO [2014-09-01 17:28:40,933] ({pool-1-thread-2}
> Server.java[doStart]:272) - jetty-8.1.14.v20131031
> INFO [2014-09-01 17:28:40,935] ({pool-1-thread-2}
> AbstractConnector.java[doStart]:338) - Started
> SocketConnector@0.0.0.0:52020
> INFO [2014-09-01 17:28:40,936] ({pool-1-thread-2}
> Logging.scala[logInfo]:58) - Broadcast server started at
> http://222.122.122.122:52020
> INFO [2014-09-01 17:28:40,943] ({pool-1-thread-2}
> Logging.scala[logInfo]:58) - HTTP File server directory is
> /tmp/spark-fc4cc226-c740-4cec-ad0f-6f88762d365c
> INFO [2014-09-01 17:28:40,943] ({pool-1-thread-2}
> Logging.scala[logInfo]:58) - Starting HTTP Server
> INFO [2014-09-01 17:28:40,944] ({pool-1-thread-2}
> Server.java[doStart]:272) - jetty-8.1.14.v20131031
> INFO [2014-09-01 17:28:40,946] ({pool-1-thread-2}
> AbstractConnector.java[doStart]:338) - Started
> SocketConnector@0.0.0.0:59458
> INFO [2014-09-01 17:28:41,167] ({pool-1-thread-2}
> Server.java[doStart]:272) - jetty-8.1.14.v20131031
> INFO [2014-09-01 17:28:41,177] ({pool-1-thread-2}
> AbstractConnector.java[doStart]:338) - Started
> SelectChannelConnector@0.0.0.0:4040
> INFO [2014-09-01 17:28:41,180] ({pool-1-thread-2}
> Logging.scala[logInfo]:58) - Started SparkUI at
> http://222.122.122.122:4040
> INFO [2014-09-01 17:28:41,410] ({spark-akka.actor.default-dispatcher-3}
> Logging.scala[logInfo]:58) - Connecting to master
> spark://master.spark.com:7077...
> INFO [2014-09-01 17:29:01,441] ({spark-akka.actor.default-dispatcher-4}
> Logging.scala[logInfo]:58) - Connecting to master
> spark://master.spark.com:7077...
> INFO [2014-09-01 17:29:21,440] ({spark-akka.actor.default-dispatcher-5}
> Logging.scala[logInfo]:58) - Connecting to master
> spark://master.spark.com:7077...
> ERROR [2014-09-01 17:29:41,441] ({spark-akka.actor.default-dispatcher-3}
> Logging.scala[logError]:74) - Application has been killed. Reason: All
> masters are unresponsive! Giving up.
> ERROR [2014-09-01 17:29:41,441] ({spark-akka.actor.default-dispatcher-3}
> Logging.scala[logError]:74) - Exiting due to error from cluster scheduler:
> All masters are unresponsive! Giving up.
>
>
> and spark master print logs like
>
> --------- Spark master log -------------------
> 14/09/01 17:29:01 INFO Master: akka.tcp://spark@222.122.122.122:46833 got
> disassociated, removing it.
> 14/09/01 17:29:01 INFO LocalActorRef: Message
> [akka.remote.transport.AssociationHandle$Disassociated] from
> Actor[akka://sparkMaster/deadLetters] to
> Actor[akka://sparkMaster/system/endpointManager/endpointWriter-akka.tcp%3A%2F%2Fspark%40222.122.122.122%3A46833-89/endpointReader-akka.tcp%3A%2F%2Fspark%40222.122.122.122%3A46833-0#1195166754]
> was not delivered. [163] dead letters encountered. This logging can be
> turned off or adjusted with configuration settings 'akka.log-dead-letters'
> and 'akka.log-dead-letters-during-shutdown'.
> 14/09/01 17:29:01 INFO LocalActorRef: Message
> [akka.remote.transport.AssociationHandle$Disassociated] from
> Actor[akka://sparkMaster/deadLetters] to
> Actor[akka://sparkMaster/system/transports/akkaprotocolmanager.tcp0/akkaProtocol-tcp%3A%2F%2FsparkMaster%40222.122.122.122%3A48188-183#-552318082]
> was not delivered. [164] dead letters encountered. This logging can be
> turned off or adjusted with configuration settings 'akka.log-dead-letters'
> and 'akka.log-dead-letters-during-shutdown'.
> 14/09/01 17:29:21 ERROR Remoting:
> java.io.OptionalDataException
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1370)
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
> at scala.collection.immutable.$colon$colon.readObject(List.scala:366)
> at sun.reflect.GeneratedMethodAccessor22.invoke(Unknown Source)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at
> java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
> at scala.collection.immutable.$colon$colon.readObject(List.scala:366)
> at sun.reflect.GeneratedMethodAccessor22.invoke(Unknown Source)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
> at scala.collection.immutable.$colon$colon.readObject(List.scala:366)
> at sun.reflect.GeneratedMethodAccessor22.invoke(Unknown Source)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at
> java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
> at scala.collection.immutable.$colon$colon.readObject(List.scala:366)
> at sun.reflect.GeneratedMethodAccessor22.invoke(Unknown Source)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
> at scala.collection.immutable.$colon$colon.readObject(List.scala:366)
> at sun.reflect.GeneratedMethodAccessor22.invoke(Unknown Source)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at
> java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
> at
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
> at
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
> at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
> at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
> at akka.serialization.JavaSerializer$$anonfun$1.apply(Serializer.scala:136)
> at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
> at akka.serialization.JavaSerializer.fromBinary(Serializer.scala:136)
> at akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104)
> at scala.util.Try$.apply(Try.scala:161)
> at akka.serialization.Serialization.deserialize(Serialization.scala:98)
> at akka.remote.serialization.MessageContainerSerializer.fromBinary(MessageContainerSerializer.scala:58)
> at akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104)
> at scala.util.Try$.apply(Try.scala:161)
> at akka.serialization.Serialization.deserialize(Serialization.scala:98)
> at akka.remote.MessageSerializer$.deserialize(MessageSerializer.scala:23)
> at akka.remote.DefaultMessageDispatcher.payload$lzycompute$1(Endpoint.scala:55)
> at akka.remote.DefaultMessageDispatcher.payload$1(Endpoint.scala:55)
> at akka.remote.DefaultMessageDispatcher.dispatch(Endpoint.scala:73)
> at akka.remote.EndpointReader$$anonfun$receive$2.applyOrElse(Endpoint.scala:764)
> at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)
> at akka.actor.ActorCell.invoke(ActorCell.scala:456)
> at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)
> at akka.dispatch.Mailbox.run(Mailbox.scala:219)
> at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
> at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
> at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
> at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
> at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
> 14/09/01 17:29:21 INFO Master: akka.tcp://spark@222.122.122.122:46833 got disassociated, removing it.
> 14/09/01 17:29:21 INFO LocalActorRef: Message [akka.remote.transport.AssociationHandle$Disassociated] from Actor[akka://sparkMaster/deadLetters] to Actor[akka://sparkMaster/system/endpointManager/endpointWriter-akka.tcp%3A%2F%2Fspark%40222.122.122.122%3A46833-90/endpointReader-akka.tcp%3A%2F%2Fspark%40222.122.122.122%3A46833-0#-1619151630] was not delivered. [165] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
> 14/09/01 17:29:21 INFO LocalActorRef: Message [akka.remote.transport.AssociationHandle$Disassociated] from Actor[akka://sparkMaster/deadLetters] to Actor[akka://sparkMaster/system/transports/akkaprotocolmanager.tcp0/akkaProtocol-tcp%3A%2F%2FsparkMaster%40222.122.122.122%3A48189-184#387676530] was not delivered. [166] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
> 14/09/01 17:29:41 INFO Master: akka.tcp://spark@222.122.122.122:46833 got disassociated, removing it.
> 14/09/01 17:29:41 INFO Master: akka.tcp://spark@222.122.122.122:46833 got disassociated, removing it.
> 14/09/01 17:29:41 INFO Master: akka.tcp://spark@222.122.122.122:46833 got disassociated, removing it.
> 14/09/01 17:29:41 INFO Master: akka.tcp://spark@222.122.122.122:46833 got disassociated, removing it.
> 14/09/01 17:29:41 INFO LocalActorRef: Message [akka.remote.transport.ActorTransportAdapter$DisassociateUnderlying] from Actor[akka://sparkMaster/deadLetters] to Actor[akka://sparkMaster/system/transports/akkaprotocolmanager.tcp0/akkaProtocol-tcp%3A%2F%2FsparkMaster%40222.122.122.122%3A48190-185#772674183] was not delivered. [167] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
> 14/09/01 17:29:41 INFO LocalActorRef: Message [akka.remote.transport.ActorTransportAdapter$DisassociateUnderlying] from Actor[akka://sparkMaster/deadLetters] to Actor[akka://sparkMaster/system/transports/akkaprotocolmanager.tcp0/akkaProtocol-tcp%3A%2F%2Fspark%40222.122.122.122%3A46833-182#293009171] was not delivered. [168] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
> 14/09/01 17:29:41 ERROR EndpointWriter: AssociationError [akka.tcp://sparkMaster@master.spark.com:7077] -> [akka.tcp://spark@222.122.122.122:46833]: Error [Association failed with [akka.tcp://spark@222.122.122.122:46833]] [
> akka.remote.EndpointAssociationException: Association failed with [akka.tcp://spark@222.122.122.122:46833]
> Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: /222.122.122.122:46833
> ]
> 14/09/01 17:29:41 INFO Master: akka.tcp://spark@222.122.122.122:46833 got disassociated, removing it.
> 14/09/01 17:29:41 ERROR EndpointWriter: AssociationError [akka.tcp://sparkMaster@master.spark.com:7077] -> [akka.tcp://spark@222.122.122.122:46833]: Error [Association failed with [akka.tcp://spark@222.122.122.122:46833]] [
> akka.remote.EndpointAssociationException: Association failed with [akka.tcp://spark@222.122.122.122:46833]
> Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: /222.122.122.122:46833
> ]
> 14/09/01 17:29:41 INFO Master: akka.tcp://spark@222.122.122.122:46833 got disassociated, removing it.
> 14/09/01 17:29:41 ERROR EndpointWriter: AssociationError [akka.tcp://sparkMaster@master.spark.com:7077] -> [akka.tcp://spark@222.122.122.122:46833]: Error [Association failed with [akka.tcp://spark@222.122.122.122:46833]] [
> akka.remote.EndpointAssociationException: Association failed with [akka.tcp://spark@222.122.122.122:46833]
> Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: /222.122.122.122:46833
> ]
> 14/09/01 17:29:41 INFO Master: akka.tcp://spark@222.122.122.122:46833 got disassociated, removing it.
>
> -------------
>
> My application and the Spark master are running on the same node,
> and the MASTER env variable is set to spark://master.spark.com:7077.
> I'm using Spark 1.0.1.
> I verified that my cluster works smoothly with spark-shell.
> Can someone give me a clue how I can fix this problem?
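>
> From the Master log it looks like the Master accepted my driver's connection but then could not connect back to it at 222.122.122.122:46833 ("Connection refused"). In case it is relevant, here is a sketch of how I could pin the address the driver advertises before creating the context — whether spark.driver.host is the right knob here is my guess, and "master.spark.com" is just the hostname my Master uses:
>
```java
// Sketch (assumption, not a confirmed fix): force the driver to advertise
// an address that the Master can reach back, instead of the auto-detected IP.
// "master.spark.com" is assumed to resolve to this node on every machine.
SparkConf conf = new SparkConf()
    .setMaster("spark://master.spark.com:7077")
    .setAppName("App name")
    .set("spark.driver.host", "master.spark.com"); // address the Master connects back to
SparkContext sparkContext = new SparkContext(conf);
```
> Does this look like the right direction?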
>
> Thanks.
>