Posted to issues@spark.apache.org by "Wenchen Fan (JIRA)" <ji...@apache.org> on 2017/06/20 09:18:00 UTC
[jira] [Resolved] (SPARK-20989) Fail to start multiple workers on
one host if external shuffle service is enabled in standalone mode
[ https://issues.apache.org/jira/browse/SPARK-20989?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Wenchen Fan resolved SPARK-20989.
---------------------------------
Resolution: Fixed
Fix Version/s: 2.3.0
Issue resolved by pull request 18290
[https://github.com/apache/spark/pull/18290]
> Fail to start multiple workers on one host if external shuffle service is enabled in standalone mode
> ----------------------------------------------------------------------------------------------------
>
> Key: SPARK-20989
> URL: https://issues.apache.org/jira/browse/SPARK-20989
> Project: Spark
> Issue Type: Bug
> Components: Deploy, Spark Core
> Affects Versions: 2.1.1
> Reporter: Jiang Xingbo
> Priority: Minor
> Fix For: 2.3.0
>
>
> In standalone mode, if we enable the external shuffle service by setting `spark.shuffle.service.enabled` to true, and then try to start multiple workers on one host (by setting `SPARK_WORKER_INSTANCES=3` in spark-env.sh and running `sbin/start-slaves.sh`), only one worker per host launches successfully and the rest fail to start.
> The reason is that the external shuffle service's port is configured by `spark.shuffle.service.port`, so currently no more than one external shuffle service can be started per host. In this case, each worker tries to start its own external shuffle service, and only one of them succeeds.
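The port conflict described above can be illustrated with plain sockets, independent of Spark (a minimal sketch; the loopback address and the dynamically chosen port are illustrative, not the actual shuffle service port):

```python
import socket

# First "worker": its shuffle service binds and listens on a port,
# analogous to the single port fixed by spark.shuffle.service.port.
s1 = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s1.bind(("127.0.0.1", 0))          # let the OS pick a free port
port = s1.getsockname()[1]
s1.listen()

# Second "worker" on the same host tries the same port and fails
# with EADDRINUSE, so its shuffle service (and worker) cannot start.
s2 = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    s2.bind(("127.0.0.1", port))
    conflict = False
except OSError:
    conflict = True

print(conflict)
s1.close()
s2.close()
```

Because every worker on the host reads the same `spark.shuffle.service.port` value, the second and later workers always hit this bind failure.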
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org