Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2017/06/13 15:30:00 UTC

[jira] [Commented] (SPARK-20989) Fail to start multiple workers on one host if external shuffle service is enabled in standalone mode

    [ https://issues.apache.org/jira/browse/SPARK-20989?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16048028#comment-16048028 ] 

Apache Spark commented on SPARK-20989:
--------------------------------------

User 'jiangxb1987' has created a pull request for this issue:
https://github.com/apache/spark/pull/18290

> Fail to start multiple workers on one host if external shuffle service is enabled in standalone mode
> ----------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-20989
>                 URL: https://issues.apache.org/jira/browse/SPARK-20989
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy, Spark Core
>    Affects Versions: 2.1.1
>            Reporter: Jiang Xingbo
>            Priority: Minor
>
> In standalone mode, if we enable the external shuffle service by setting `spark.shuffle.service.enabled` to true and then try to start multiple workers on one host (by setting `SPARK_WORKER_INSTANCES=3` in spark-env.sh and then running `sbin/start-slaves.sh`), only one worker launches successfully on each host and the rest of the workers fail to launch.
> The reason is that the port of the external shuffle service is configured by `spark.shuffle.service.port`, so currently we can start no more than one external shuffle service on each host. In our case, each worker tries to start an external shuffle service, and only one of them succeeds in doing so.
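
A minimal reproduction sketch based on the settings named in the report, assuming a stock Spark 2.1.1 standalone layout with the target host listed in conf/slaves (the default shuffle service port, 7337, is from the Spark configuration documentation):

    # conf/spark-env.sh -- ask the launch scripts to start three worker
    # instances on each host listed in conf/slaves
    export SPARK_WORKER_INSTANCES=3

    # conf/spark-defaults.conf -- with this enabled, every standalone worker
    # also tries to start an external shuffle service when it comes up
    spark.shuffle.service.enabled true

    # run from $SPARK_HOME: start the master, then the workers; the first
    # worker on each host binds the shuffle service port, the other two fail
    sbin/start-master.sh
    sbin/start-slaves.sh

The underlying conflict is ordinary TCP port contention: all three workers on a host read the same `spark.shuffle.service.port` value (7337 by default), only one process can bind it, and the later binds fail (typically with a java.net.BindException: Address already in use). Setting the port explicitly does not help, because it is still a single value shared by every worker on the host:

    # conf/spark-defaults.conf -- still one shared value per host, so the
    # second and third shuffle services fail to bind
    spark.shuffle.service.port 7337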



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org