Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2015/07/09 12:51:04 UTC

[jira] [Resolved] (SPARK-8941) Standalone cluster worker does not accept multiple masters on launch

     [ https://issues.apache.org/jira/browse/SPARK-8941?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-8941.
------------------------------
    Resolution: Duplicate

Yeah, I remember that changed, and I remember later updating the docs, which maybe hadn't reflected that change until recently.

> Standalone cluster worker does not accept multiple masters on launch
> --------------------------------------------------------------------
>
>                 Key: SPARK-8941
>                 URL: https://issues.apache.org/jira/browse/SPARK-8941
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy
>    Affects Versions: 1.4.0
>            Reporter: Jesper Lundgren
>            Priority: Trivial
>
> Before 1.4 it was possible to launch a worker node using a comma-separated list of master nodes. 
> ex:
> sbin/start-slave.sh 1 "spark://localhost:7077,localhost:7078"
> starting org.apache.spark.deploy.worker.Worker, logging to /Users/jesper/Downloads/spark-1.4.0-bin-cdh4/sbin/../logs/spark-jesper-org.apache.spark.deploy.worker.Worker-1-Jespers-MacBook-Air.local.out
> failed to launch org.apache.spark.deploy.worker.Worker:
>                              Default is conf/spark-defaults.conf.
>   15/07/09 12:33:06 INFO Utils: Shutdown hook called
> Spark 1.2 and 1.3.1 accepts multiple masters in this format.
> update: in 1.4, start-slave.sh only expects the master list (no instance number)
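
For reference, a minimal sketch of the two invocation styles described above (assuming a single machine, a master on the default port 7077, and a purely illustrative second master on 7078):

    # Spark 1.3.x and earlier: worker instance number first, then the master list
    sbin/start-slave.sh 1 "spark://localhost:7077,localhost:7078"

    # Spark 1.4.x: the master list only, with no instance number
    sbin/start-slave.sh "spark://localhost:7077,localhost:7078"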



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
