Posted to issues@spark.apache.org by "Jesper Lundgren (JIRA)" <ji...@apache.org> on 2015/07/09 06:44:04 UTC
[jira] [Updated] (SPARK-8941) Standalone cluster worker does not accept multiple masters on launch
[ https://issues.apache.org/jira/browse/SPARK-8941?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Jesper Lundgren updated SPARK-8941:
-----------------------------------
Description:
Before 1.4 it was possible to launch a worker node using a comma-separated list of master nodes.
ex:
sbin/start-slave.sh 1 "spark://localhost:7077,localhost:7078"
starting org.apache.spark.deploy.worker.Worker, logging to /Users/jesper/Downloads/spark-1.4.0-bin-cdh4/sbin/../logs/spark-jesper-org.apache.spark.deploy.worker.Worker-1-Jespers-MacBook-Air.local.out
failed to launch org.apache.spark.deploy.worker.Worker:
Default is conf/spark-defaults.conf.
15/07/09 12:33:06 INFO Utils: Shutdown hook called
Spark 1.2 and 1.3.1 accept multiple masters in this format; the failed launch above is from 1.4.0.
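For context, a minimal sketch of the commands involved, assuming the two-master standalone HA setup from the example above (localhost:7077 and localhost:7078). The direct spark-class invocation of the Worker class is an assumed possible workaround, not a verified fix for 1.4.0:

# Pre-1.4 form reported above: worker instance 1, comma-separated list of masters
sbin/start-slave.sh 1 "spark://localhost:7077,localhost:7078"

# Assumed workaround: launch the Worker class directly via spark-class,
# passing the same comma-separated spark:// master URL
bin/spark-class org.apache.spark.deploy.worker.Worker "spark://localhost:7077,localhost:7078"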
was:
Before 1.4 it was possible to launch a worker node using a comma-separated list of master nodes.
ex:
sbin/start-slave.sh 1 "spark://localhost:7077,localhost:7078"
starting org.apache.spark.deploy.worker.Worker, logging to /Users/jesper/Downloads/spark-1.4.0-bin-cdh4/sbin/../logs/spark-jesper-org.apache.spark.deploy.worker.Worker-1-Jespers-MacBook-Air.local.out
failed to launch org.apache.spark.deploy.worker.Worker:
Default is conf/spark-defaults.conf.
15/07/09 12:33:06 INFO Utils: Shutdown hook called
I tried this in Spark 1.2 and Spark 1.3.1 and it works there, so this seems to be an issue introduced with 1.4.
> Standalone cluster worker does not accept multiple masters on launch
> --------------------------------------------------------------------
>
> Key: SPARK-8941
> URL: https://issues.apache.org/jira/browse/SPARK-8941
> Project: Spark
> Issue Type: Bug
> Components: Deploy
> Affects Versions: 1.4.0
> Reporter: Jesper Lundgren
>
> Before 1.4 it was possible to launch a worker node using a comma-separated list of master nodes.
> ex:
> sbin/start-slave.sh 1 "spark://localhost:7077,localhost:7078"
> starting org.apache.spark.deploy.worker.Worker, logging to /Users/jesper/Downloads/spark-1.4.0-bin-cdh4/sbin/../logs/spark-jesper-org.apache.spark.deploy.worker.Worker-1-Jespers-MacBook-Air.local.out
> failed to launch org.apache.spark.deploy.worker.Worker:
> Default is conf/spark-defaults.conf.
> 15/07/09 12:33:06 INFO Utils: Shutdown hook called
> Spark 1.2 and 1.3.1 accept multiple masters in this format.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org