Posted to issues@spark.apache.org by "Sree Vaddi (JIRA)" <ji...@apache.org> on 2015/04/14 23:35:59 UTC

[jira] [Commented] (SPARK-967) start-slaves.sh uses local path from master on remote slave nodes

    [ https://issues.apache.org/jira/browse/SPARK-967?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14494938#comment-14494938 ] 

Sree Vaddi commented on SPARK-967:
----------------------------------

Initiated conversation on dev list.


> start-slaves.sh uses local path from master on remote slave nodes
> -----------------------------------------------------------------
>
>                 Key: SPARK-967
>                 URL: https://issues.apache.org/jira/browse/SPARK-967
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy
>    Affects Versions: 0.8.0, 0.8.1, 0.9.0
>            Reporter: Evgeniy Tsvigun
>            Priority: Trivial
>              Labels: script, starter
>
> If a slave node's home path differs from the master's, start-slaves.sh fails to start a worker instance on that node; the other nodes behave as expected. In my case:
>     $ ./bin/start-slaves.sh 
>     node05.dev.vega.ru: bash: line 0: cd: /usr/home/etsvigun/spark/bin/..: No such file or directory
>     node04.dev.vega.ru: org.apache.spark.deploy.worker.Worker running as process 4796. Stop it first.
>     node03.dev.vega.ru: org.apache.spark.deploy.worker.Worker running as process 61348. Stop it first.
> I don't mention /usr/home anywhere; the only environment variable I set is $SPARK_HOME, defined relative to $HOME on every node. This makes me think some script takes `pwd` on the master and tries to use the result on the slaves.
> Spark version: fb6875dd5c9334802580155464cef9ac4d4cc1f0
> OS:  FreeBSD 8.4
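
A minimal sketch of the failure mode described in the quoted issue, assuming a simplified slaves.sh-style launcher; the loop, file names, and variable names below are illustrative only, not the actual Spark scripts:

    # Illustrative only -- a simplified model of the reported behaviour, not the real Spark code.

    # On the master, the launcher resolves its own absolute install path ...
    sbin="$(cd "$(dirname "$0")" && pwd)"        # e.g. /usr/home/etsvigun/spark/bin

    # ... and embeds that literal path in the command it runs on every slave,
    # so the master's home directory leaks onto slaves whose home paths differ:
    for slave in $(cat conf/slaves); do
        ssh "$slave" "cd $sbin/.. && ./bin/start-slave.sh"
    done

    # A path-agnostic variant would let each slave resolve the path itself,
    # e.g. by quoting so the variable is expanded on the remote side:
    for slave in $(cat conf/slaves); do
        ssh "$slave" 'cd "$SPARK_HOME" && ./bin/start-slave.sh'
    done

The key difference is where the path gets expanded: double quotes expand $sbin on the master before the command is sent, while single quotes defer expansion of $SPARK_HOME to the remote shell on each slave.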


