Posted to issues@spark.apache.org by "liuzhaokun (JIRA)" <ji...@apache.org> on 2017/06/22 06:52:00 UTC
[jira] [Commented] (SPARK-21169) Spark HA: Jobs state is in WAITING status after reconnecting to standby master
[ https://issues.apache.org/jira/browse/SPARK-21169?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16058863#comment-16058863 ]
liuzhaokun commented on SPARK-21169:
------------------------------------
Is there any error information in the logs? Could you share your driver's log?
> Spark HA: Jobs state is in WAITING status after reconnecting to standby master
> ------------------------------------------------------------------------------
>
> Key: SPARK-21169
> URL: https://issues.apache.org/jira/browse/SPARK-21169
> Project: Spark
> Issue Type: Bug
> Components: Web UI
> Affects Versions: 2.1.0
> Reporter: Srinivasarao Daruna
>
> I have created a Spark cluster with 2 Spark masters and a separate ZooKeeper cluster.
> Configured the following:
> SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER -Dspark.deploy.zookeeper.url=zk_machine1:2181,zk_machine2:2181 -Dspark.deploy.zookeeper.dir=/secondlook/spark-ha"
> spark.master configuration in spark-defaults looks as below.
> spark://spark_master1:7077,spark_master2:7077
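For reference, the HA-related settings described above can be collected in one place. This is a sketch only; the hostnames, ports, and the ZooKeeper directory are the reporter's placeholders, not verified values:

```shell
# spark-env.sh (on both master nodes)
# ZooKeeper-based recovery mode so a standby master can take over
export SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER \
  -Dspark.deploy.zookeeper.url=zk_machine1:2181,zk_machine2:2181 \
  -Dspark.deploy.zookeeper.dir=/secondlook/spark-ha"

# spark-defaults.conf (on the submitting machine)
# Comma-separated master list lets the driver fail over between masters
# spark.master  spark://spark_master1:7077,spark_master2:7077
```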
> 1) Submitted a spark streaming job with the spark master configuration above. The job got its resources and moved to RUNNING state. The job is running in client mode.
> 2) Killed spark master 1, which was active at the time of start.
> 3) Workers shifted to the STANDBY master, and the standby master became ACTIVE.
> 4) The running job appeared in the new spark master's UI as well.
> However, the application state appears as WAITING instead of RUNNING.
> The application's executors are in RUNNING status.
> It looks like the spark application state is not updated after reconnecting to the standby master.
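One way to cross-check what the UI reports is to query the standalone master's JSON endpoint (e.g. `curl http://spark_master2:8080/json`) and inspect the application state directly. A minimal sketch, assuming a response shaped like the real endpoint's output; the application id, name, and values below are illustrative, not taken from the reporter's cluster:

```python
import json

# Hypothetical excerpt of the master's /json response after failover.
# The bug report says the app shows WAITING here even though its
# executors are RUNNING.
sample = json.loads("""
{
  "activeapps": [
    {"id": "app-20170622000000-0001", "name": "streaming-job", "state": "WAITING"}
  ]
}
""")

# Print each registered application and its state as seen by the master.
for app in sample["activeapps"]:
    print(app["id"], app["state"])
```

Comparing this output against the executor states on the worker pages would confirm whether only the master's view of the application is stale.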
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org