Posted to issues@spark.apache.org by "Denton Cockburn (JIRA)" <ji...@apache.org> on 2016/01/20 17:35:40 UTC

[jira] [Commented] (SPARK-12022) spark-shell cannot run on master created with spark-ec2

    [ https://issues.apache.org/jira/browse/SPARK-12022?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15108862#comment-15108862 ] 

Denton Cockburn commented on SPARK-12022:
-----------------------------------------

I encountered the same problem with Spark 1.6.0.

Running
{code}
~/ephemeral-hdfs/bin/hadoop fs -chmod 777 /tmp/hive
{code}
solved it for me.
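
To confirm the change, a quick before/after check works; a sketch, assuming the same spark-ec2 layout with ephemeral HDFS under {{~/ephemeral-hdfs}}:
{code}
# Before the fix, /tmp/hive shows the failing permissions from the report:
#   drwx--x--x ... /tmp/hive
~/ephemeral-hdfs/bin/hadoop fs -ls /tmp

# Open the scratch dir to all users, then re-check: it should now show drwxrwxrwx.
~/ephemeral-hdfs/bin/hadoop fs -chmod 777 /tmp/hive
~/ephemeral-hdfs/bin/hadoop fs -ls /tmp
{code}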

> spark-shell cannot run on master created with spark-ec2
> -------------------------------------------------------
>
>                 Key: SPARK-12022
>                 URL: https://issues.apache.org/jira/browse/SPARK-12022
>             Project: Spark
>          Issue Type: Bug
>          Components: EC2, Spark Shell, SQL
>    Affects Versions: 1.5.1
>         Environment: AWS EC2
>            Reporter: Simeon Simeonov
>              Labels: ec2, spark-shell, sql
>
> Running {{./bin/spark-shell}} on the master node created with {{spark-ec2}} results in:
> {code}
> java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwx--x--x
> 	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
> 	at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:171)
> 	at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:162)
> 	at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:160)
> 	at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:167)
> {code}
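
For context: the stack trace shows the permission check firing during {{HiveContext}} construction, which {{spark-shell}} performs eagerly at startup, so a bare launch reproduces the failure. A sketch, assuming the usual spark-ec2 install under {{/root/spark}} (paths may differ):
{code}
# While /tmp/hive on HDFS is rwx--x--x, startup fails with the
# RuntimeException above; after the chmod, the shell starts cleanly.
/root/spark/bin/spark-shell
{code}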



