Posted to issues@spark.apache.org by "Robert Beauchemin (JIRA)" <ji...@apache.org> on 2015/08/17 23:16:46 UTC

[jira] [Created] (SPARK-10066) Can't create HiveContext with spark-shell or spark-sql on snapshot

Robert Beauchemin created SPARK-10066:
-----------------------------------------

             Summary: Can't create HiveContext with spark-shell or spark-sql on snapshot
                 Key: SPARK-10066
                 URL: https://issues.apache.org/jira/browse/SPARK-10066
             Project: Spark
          Issue Type: Bug
          Components: Spark Shell
    Affects Versions: 1.5.0
         Environment: Centos 6.6
            Reporter: Robert Beauchemin
            Priority: Minor
             Fix For: 1.5.0


Built the 1.5.0-preview-20150812 snapshot with the following command:

./make-distribution.sh -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver -Psparkr -DskipTests

Starting spark-shell or spark-sql returns the following error:

java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwx------
        at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:612)
        ... [elided]
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)

The failure occurs while the shell is trying to create a new HiveContext. Running pyspark or sparkR works and creates a HiveContext successfully, and a SQLContext can be created successfully from any of the shells.
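
For reference, the manual reproduction in spark-shell looks roughly like this (the constructors shown are the standard 1.5.0 ones; the session itself is just a sketch):

    // Works from any of the shells: a plain SQLContext over the existing SparkContext
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)

    // Fails in spark-shell on this snapshot with the RuntimeException above
    val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)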

I've tried changing permissions on that HDFS directory (even making it world-writable), changing SPARK_USER, and running spark-shell as different users, all without success.
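
To give an idea of what was tried, the permission and user changes were along these lines (the path comes from the error message; the user name is illustrative):

    hdfs dfs -chmod -R 777 /tmp/hive    # make the HDFS scratch dir world-writable
    hdfs dfs -ls /tmp                   # confirms the new permissions took effect
    export SPARK_USER=hive              # also tried other user names
    ./bin/spark-shell                   # still fails with the same error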

This works successfully on the same machine with 1.4.1 and with earlier pre-release versions of Spark 1.5.0 (same make-distribution parameters). Just trying the snapshot... 





