Posted to issues@spark.apache.org by "Jesper Lundgren (JIRA)" <ji...@apache.org> on 2015/07/09 08:08:04 UTC
[jira] [Comment Edited] (SPARK-8941) Standalone cluster worker does not accept multiple masters on launch
[ https://issues.apache.org/jira/browse/SPARK-8941?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14619958#comment-14619958 ]
Jesper Lundgren edited comment on SPARK-8941 at 7/9/15 6:08 AM:
----------------------------------------------------------------
After some more testing, it seems start-slave.sh no longer takes an instance number; removing the "1" and giving only the list of masters works.
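For reference, a minimal sketch of the 1.4-style launch that works (same placeholder master list as in the issue description below):

sbin/start-slave.sh "spark://localhost:7077,localhost:7078"
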
was (Author: koudelka):
After some more testing, it seems start-slave.sh no longer takes an instance number; removing the "1" and giving only the list of masters works.
> Standalone cluster worker does not accept multiple masters on launch
> --------------------------------------------------------------------
>
> Key: SPARK-8941
> URL: https://issues.apache.org/jira/browse/SPARK-8941
> Project: Spark
> Issue Type: Bug
> Components: Deploy
> Affects Versions: 1.4.0
> Reporter: Jesper Lundgren
> Priority: Trivial
>
> Before 1.4 it was possible to launch a worker node using a comma-separated list of master nodes.
> ex:
> sbin/start-slave.sh 1 "spark://localhost:7077,localhost:7078"
> starting org.apache.spark.deploy.worker.Worker, logging to /Users/jesper/Downloads/spark-1.4.0-bin-cdh4/sbin/../logs/spark-jesper-org.apache.spark.deploy.worker.Worker-1-Jespers-MacBook-Air.local.out
> failed to launch org.apache.spark.deploy.worker.Worker:
> Default is conf/spark-defaults.conf.
> 15/07/09 12:33:06 INFO Utils: Shutdown hook called
> Spark 1.2 and 1.3.1 accept multiple masters in this format.
> update: in 1.4, start-slave.sh expects only the master list (no instance number)
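> For comparison, a sketch of both invocations (hosts and ports are the placeholder values from above):
> # Spark 1.2 / 1.3.1: instance number followed by the comma-separated master list
> sbin/start-slave.sh 1 "spark://localhost:7077,localhost:7078"
> # Spark 1.4: the master list only, no instance number
> sbin/start-slave.sh "spark://localhost:7077,localhost:7078"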
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org