Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2015/06/16 20:42:00 UTC

[jira] [Commented] (SPARK-8395) spark-submit documentation is incorrect

    [ https://issues.apache.org/jira/browse/SPARK-8395?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14588551#comment-14588551 ] 

Sean Owen commented on SPARK-8395:
----------------------------------

I think that's right. This looks like a hold-over from when this might have been controlled by spark-daemon.sh. You can raise a PR for this.

> spark-submit documentation is incorrect
> ---------------------------------------
>
>                 Key: SPARK-8395
>                 URL: https://issues.apache.org/jira/browse/SPARK-8395
>             Project: Spark
>          Issue Type: Improvement
>          Components: Documentation
>    Affects Versions: 1.4.0
>            Reporter: Dev Lakhani
>            Priority: Minor
>
> Using a fresh download of 1.4.0-bin-hadoop2.6, if you run
> ./start-slave.sh 1 spark://localhost:7077
> you get:
> failed to launch org.apache.spark.deploy.worker.Worker:
>                              Default is conf/spark-defaults.conf.
>   15/06/16 13:11:08 INFO Utils: Shutdown hook called
> It seems the worker number is not being accepted as described here:
> https://spark.apache.org/docs/latest/spark-standalone.html
> The documentation there says:
> ./sbin/start-slave.sh <worker#> <master-spark-URL>
> but the start-slave.sh script itself states:
> usage="Usage: start-slave.sh <spark-master-URL> where <spark-master-URL> is like spark://localhost:7077"
> I have checked for similar issues using:
> https://issues.apache.org/jira/browse/SPARK-6552?jql=text%20~%20%22start-slave%22
> and found nothing similar, so I am raising this as an issue.
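
The mismatch can be sketched as follows, assuming the usage string quoted above reflects the script's actual behavior; check_master_url is a hypothetical helper for illustration, not code from the real sbin/start-slave.sh:

```shell
# Hedged sketch: per the script's own usage string, start-slave.sh takes
# only a master URL, not a worker number, so a leading "1" argument fails.
# check_master_url is a hypothetical helper, not part of the real script.
check_master_url() {
  case "$1" in
    spark://*) return 0 ;;   # looks like a spark master URL
    *)         return 1 ;;   # e.g. a bare worker number such as "1"
  esac
}

# This is why "./start-slave.sh 1 spark://localhost:7077" from the old
# documented form produces the launch failure shown above.
check_master_url "spark://localhost:7077" && echo "accepted"
check_master_url "1" || echo "rejected: not a spark:// URL"
```

The likely documentation fix is simply to drop the <worker#> argument from the example invocation so it matches the script's usage string.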



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org