Posted to issues@spark.apache.org by "Marcelo Vanzin (JIRA)" <ji...@apache.org> on 2015/12/19 01:07:46 UTC

[jira] [Resolved] (SPARK-12435) Installing Spark

     [ https://issues.apache.org/jira/browse/SPARK-12435?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Marcelo Vanzin resolved SPARK-12435.
------------------------------------
    Resolution: Duplicate

See if the workaround works for you:
https://issues.apache.org/jira/browse/SPARK-10528?focusedCommentId=14957661&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-14957661
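For reference, the workaround discussed in that SPARK-10528 comment is commonly reported as granting write permissions on the Hive scratch directory using winutils.exe. A hedged sketch, assuming HADOOP_HOME is set and winutils.exe is present under its bin directory (the exact paths may differ on your machine):

```shell
REM Sketch of the commonly reported Windows workaround (environment-specific).
REM Assumes %HADOOP_HOME%\bin\winutils.exe exists and \tmp\hive is the
REM Hive scratch dir on the drive you launch Spark from.
%HADOOP_HOME%\bin\winutils.exe chmod 777 \tmp\hive

REM Verify the resulting permissions:
%HADOOP_HOME%\bin\winutils.exe ls \tmp\hive
```

After this, permissions should show as drwxrwxrwx rather than rw-rw-rw-, and spark-shell should start without the SessionState error.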

> Installing Spark
> ----------------
>
>                 Key: SPARK-12435
>                 URL: https://issues.apache.org/jira/browse/SPARK-12435
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>    Affects Versions: 1.5.2
>         Environment: Windows 7 - 64 bit
>            Reporter: Amit
>            Priority: Blocker
>
> Hello There
> I am attempting to install Spark on Windows 7. I am able to get Spark version 1.2.2 for Hadoop 2.3 running without a problem.
> However, when installing Spark version 1.5.2 for Hadoop 2.6, I get the following error:
> java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
>         at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
>         at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:171)
>
> Please suggest.
> Thanks
> Amit



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org