Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2015/05/06 08:53:12 UTC

[jira] [Updated] (SPARK-3685) Spark's local dir should accept only local paths

     [ https://issues.apache.org/jira/browse/SPARK-3685?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen updated SPARK-3685:
-----------------------------
    Target Version/s:   (was: 1.2.0)

> Spark's local dir should accept only local paths
> ------------------------------------------------
>
>                 Key: SPARK-3685
>                 URL: https://issues.apache.org/jira/browse/SPARK-3685
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core, YARN
>    Affects Versions: 1.1.0
>            Reporter: Andrew Or
>
> When you try to set the local dirs to "hdfs:/tmp/foo", it doesn't work. Instead, Spark creates a directory literally named "hdfs:" and puts "tmp" inside it. This is because Utils#getOrCreateLocalRootDirs uses java.io.File rather than Hadoop's FileSystem to parse the path, so the URI scheme is never interpreted. We also need to resolve the path appropriately.
> This may not have an urgent use case, but it fails silently and does the least expected thing.
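
[Editor's note: a minimal sketch of the kind of check the report suggests, assuming a hypothetical validateLocalDir helper; this is not Spark's actual code, and the real fix may instead resolve the path through Hadoop's FileSystem.]

    import java.net.URI

    // Reject any local-dir entry whose URI scheme is not local, instead of
    // letting java.io.File silently create a directory named "hdfs:".
    def validateLocalDir(dir: String): String = {
      val scheme = new URI(dir).getScheme
      if (scheme != null && scheme != "file") {
        throw new IllegalArgumentException(
          s"spark.local.dir must be a local path, got scheme '$scheme' in: $dir")
      }
      dir
    }

Failing fast with an explicit error, rather than creating an unexpected directory, addresses the "fails silently" part of the report.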



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org