Posted to issues@spark.apache.org by "Patrick Wendell (JIRA)" <ji...@apache.org> on 2015/04/24 22:41:38 UTC

[jira] [Commented] (SPARK-7108) Setting spark.local.dir in driver no longer overrides the standalone worker's local directory setting

    [ https://issues.apache.org/jira/browse/SPARK-7108?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14511687#comment-14511687 ] 

Patrick Wendell commented on SPARK-7108:
----------------------------------------

Ping [~vanzin] who authored SPARK-4834

> Setting spark.local.dir in driver no longer overrides the standalone worker's local directory setting
> -----------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-7108
>                 URL: https://issues.apache.org/jira/browse/SPARK-7108
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.2.1, 1.3.0
>            Reporter: Josh Rosen
>            Priority: Critical
>
> Prior to SPARK-4834, configuring spark.local.dir in the driver would affect the local directories created on the executor.  After this patch, executors will always ignore this setting in favor of directories read from {{SPARK_LOCAL_DIRS}}, which is set by the standalone worker based on the worker's own configuration and not the application configuration.
> This change impacts users who configured {{spark.local.dir}} only in their driver and not via their cluster's {{spark-defaults.conf}} or {{spark-env.sh}} files.  This is an atypical use-case, since the available local directories / disks are a property of the cluster and not the application, which probably explains why this issue has not been reported previously.
> The correct fix might simply be comment and documentation improvements.
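
For users hitting this, the cluster-side configuration described above can be sketched as follows (directory paths are placeholders; SPARK_LOCAL_DIRS is the variable the standalone worker passes to its executors):

```shell
# conf/spark-env.sh on each standalone worker (example paths):
# The worker exports SPARK_LOCAL_DIRS to the executors it launches,
# and after SPARK-4834 this takes precedence over any spark.local.dir
# set only in the driver's SparkConf.
export SPARK_LOCAL_DIRS=/mnt/disk1/spark,/mnt/disk2/spark
```

Setting spark.local.dir in the cluster's conf/spark-defaults.conf would have a similar effect, since local disks are a property of the cluster rather than the application.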



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
