Posted to issues@spark.apache.org by "Wenchen Fan (JIRA)" <ji...@apache.org> on 2017/10/10 12:51:01 UTC
[jira] [Resolved] (SPARK-20025) Driver fail over will not work, if SPARK_LOCAL* env is set.
[ https://issues.apache.org/jira/browse/SPARK-20025?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Wenchen Fan resolved SPARK-20025.
---------------------------------
Resolution: Fixed
Fix Version/s: 2.3.0
> Driver fail over will not work, if SPARK_LOCAL* env is set.
> -----------------------------------------------------------
>
> Key: SPARK-20025
> URL: https://issues.apache.org/jira/browse/SPARK-20025
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 2.1.0, 2.2.0
> Reporter: Prashant Sharma
> Assignee: Prashant Sharma
> Fix For: 2.3.0
>
>
> In a bare-metal system with no DNS setup, Spark may be configured with SPARK_LOCAL* environment variables for the IP address and hostname properties.
> During a driver failover in cluster deploy mode, SPARK_LOCAL* should be ignored while auto-deploying the driver, and the values should instead be picked up from the target system's local environment.
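
For illustration only, a minimal Scala sketch of the idea described above: when the driver is relaunched on another worker, SPARK_LOCAL* entries from the original submission environment are dropped so the target host's own settings apply. The object and method names (DriverEnvFilter, filterDriverEnv) and the sample values are hypothetical and are not taken from Spark's actual code or the patch that resolved this issue.

    // Hypothetical sketch: drop SPARK_LOCAL* entries before relaunching the driver
    object DriverEnvFilter {
      // Keep every variable except SPARK_LOCAL_IP, SPARK_LOCAL_HOSTNAME, SPARK_LOCAL_DIRS, ...
      def filterDriverEnv(env: Map[String, String]): Map[String, String] =
        env.filter { case (key, _) => !key.startsWith("SPARK_LOCAL") }

      def main(args: Array[String]): Unit = {
        val submitted = Map(
          "SPARK_LOCAL_IP" -> "10.0.0.1",     // only valid on the original host
          "SPARK_LOCAL_HOSTNAME" -> "node-a", // only valid on the original host
          "SPARK_HOME" -> "/opt/spark"
        )
        // Only SPARK_HOME survives; the target worker supplies its own local values.
        println(filterDriverEnv(submitted))
      }
    }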