Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:20:04 UTC
[jira] [Updated] (SPARK-9438) restarting leader zookeeper causes spark master to die when the spark master election is assigned to zookeeper
[ https://issues.apache.org/jira/browse/SPARK-9438?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon updated SPARK-9438:
--------------------------------
Labels: bulk-closed (was: )
> restarting leader zookeeper causes spark master to die when the spark master election is assigned to zookeeper
> --------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-9438
> URL: https://issues.apache.org/jira/browse/SPARK-9438
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 1.2.0
> Environment: Spark 1.2.0 and ZooKeeper version: 3.4.6-1569965
> Reporter: Amir Rad
> Priority: Major
> Labels: bulk-closed
>
> When Spark master election is delegated to ZooKeeper (spark.deploy.recoveryMode=ZOOKEEPER), restarting the ZooKeeper leader causes the Spark master to die.
> Steps to reproduce:
> 1. Create a cluster of 3 Spark nodes.
> 2. Set spark-env.sh to:
> SPARK_LOCAL_DIRS="/home/sparkcde/data_spark/data"
> SPARK_MASTER_OPTS="-Dspark.deploy.spreadOut=false"
> SPARK_WORKER_DIR="/home/sparkcde/data_spark/worker"
> SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true"
> SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER -Dspark.deploy.zookeeper.url=s1:2181,s2:2181,s3:2181"
> 3. Identify the Spark master.
> 4. Identify the ZooKeeper leader.
> 5. Stop the ZooKeeper leader.
> 6. Check the Spark master: it is dead.
> 7. Start the ZooKeeper leader.
> 8. Check the Spark master: still dead.
> If you repeat this pattern of stopping and starting the ZooKeeper leader, you will eventually lose the whole Spark cluster. A shell sketch of the loop follows.
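> A minimal sketch of steps 4-8, not part of the original report: it assumes passwordless SSH to the s1/s2/s3 hosts from the config above, ZooKeeper's zkServer.sh on each node's PATH, and (hypothetically) that the Spark master runs on s1.
>
> #!/usr/bin/env bash
> # Locate the ZooKeeper leader among s1..s3 via the "stat" four-letter command.
> leader=""
> for host in s1 s2 s3; do
>   mode=$(echo stat | nc "$host" 2181 | awk '/Mode:/ {print $2}')
>   if [ "$mode" = "leader" ]; then leader="$host"; fi
> done
> echo "ZooKeeper leader: ${leader:?no leader found}"
>
> # Stop the leader, wait, and check whether the Spark Master JVM is still up.
> ssh "$leader" 'zkServer.sh stop'
> sleep 30
> ssh s1 'jps | grep Master || echo "Spark master is dead"'
>
> # Restart the leader; per the report, the master does not come back.
> ssh "$leader" 'zkServer.sh start'
> sleep 30
> ssh s1 'jps | grep Master || echo "Spark master is still dead"'
>
> The expectation with ZOOKEEPER recovery mode is that a standby master wins a new leader election rather than the master process exiting outright, which is what makes the observed behavior a bug.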
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)