Posted to issues@spark.apache.org by "Javier Luraschi (JIRA)" <ji...@apache.org> on 2016/11/14 23:41:58 UTC

[jira] [Updated] (SPARK-18439) spark 2.0.1 fails in windows when using file:/// scratchdir in hive-site.xml

     [ https://issues.apache.org/jira/browse/SPARK-18439?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Javier Luraschi updated SPARK-18439:
------------------------------------
    Summary: spark 2.0.1 fails in windows when using file:/// scratchdir in hive-site.xml  (was: spark 2.0.1 fails in windows when using file:/// scratchfir)

> spark 2.0.1 fails in windows when using file:/// scratchdir in hive-site.xml
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-18439
>                 URL: https://issues.apache.org/jira/browse/SPARK-18439
>             Project: Spark
>          Issue Type: Bug
>          Components: Windows
>    Affects Versions: 2.0.1
>         Environment: Windows
>            Reporter: Javier Luraschi
>              Labels: windows
>             Fix For: 2.2.0
>
>
> A Derby-based Hive metastore set up with WINUTILS.exe, with hive.exec.scratchdir, hive.exec.local.scratchdir and hive.metastore.warehouse.dir pointing to "C:\" paths, works fine in Spark 2.0.0.
> However, Spark 2.0.1 throws a "No FileSystem for scheme: C" error at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2660).
> Changing hive-site.xml to use a "file:///c:\" path instead fixes this issue, but then an "Unable to create log directory file:///C:\" error is thrown at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522) instead.
> One note: I have seen this last exception when users do not run WINUTILS.exe to set the permissions, which makes me think that WINUTILS.exe might not support file:/// paths.
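For reference, a minimal sketch of the setup described above. The property names (hive.exec.scratchdir, hive.exec.local.scratchdir, hive.metastore.warehouse.dir) come from the report; the concrete C:\ paths are hypothetical placeholders, the object name Spark18439Repro is made up, and setting the properties through SparkSession.builder is an assumption for convenience only (the report sets them in hive-site.xml). The URI line at the top is an illustration of why a bare drive-letter path can surface as "No FileSystem for scheme: C"; it has not been verified against the exact Hadoop/Hive code path hit in 2.0.1.

    import java.net.URI
    import org.apache.spark.sql.SparkSession

    object Spark18439Repro {
      def main(args: Array[String]): Unit = {
        // Illustration only: when a bare Windows path reaches the Hadoop
        // FileSystem layer as a URI, the drive letter parses as the URI scheme.
        println(new URI("C:/tmp/hive").getScheme)  // prints "C"

        // Hypothetical reproduction of the reported configuration; all paths
        // below are placeholders, not values taken from the report.
        val spark = SparkSession.builder()
          .master("local[*]")
          .config("hive.exec.scratchdir", "C:\\tmp\\hive")                // placeholder
          .config("hive.exec.local.scratchdir", "C:\\tmp\\hive")          // placeholder
          .config("hive.metastore.warehouse.dir", "C:\\hive\\warehouse")  // placeholder
          .enableHiveSupport()
          .getOrCreate()

        spark.sql("show databases").show()  // triggers Hive session initialization
        spark.stop()
      }
    }

Swapping the placeholder values for "file:///C:/..." style paths would exercise the second failure mode described above ("Unable to create log directory").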



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org