Posted to issues@spark.apache.org by "Andrew Or (JIRA)" <ji...@apache.org> on 2015/05/08 04:49:59 UTC

[jira] [Created] (SPARK-7470) Spark shell not having hive crashes SQLContext

Andrew Or created SPARK-7470:
--------------------------------

             Summary: Spark shell not having hive crashes SQLContext
                 Key: SPARK-7470
                 URL: https://issues.apache.org/jira/browse/SPARK-7470
             Project: Spark
          Issue Type: Bug
          Components: Spark Shell, SQL
    Affects Versions: 1.3.0
            Reporter: Andrew Or
            Assignee: Andrew Or
            Priority: Critical


If Hive is not found on my classpath, I get the following exception and I can no longer use the SQLContext at all. In fact, we already catch `java.lang.ClassNotFoundException` in case this happens; we just don't also catch `java.lang.NoClassDefFoundError`.

{code}
15/05/07 17:07:30 INFO BlockManagerMaster: Registered BlockManager
15/05/07 17:07:30 INFO EventLoggingListener: Logging events to file:/tmp/spark-events/local-1431043649919
15/05/07 17:07:30 INFO SparkILoop: Created spark context..
Spark context available as sc.
java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf
	at java.lang.Class.getDeclaredConstructors0(Native Method)
	at java.lang.Class.privateGetDeclaredConstructors(Class.java:2493)
	at java.lang.Class.getConstructor0(Class.java:2803)
	at java.lang.Class.getConstructor(Class.java:1718)
	at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1026)
	at $iwC$$iwC.<init>(<console>:9)
	at $iwC.<init>(<console>:18)
{code}
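A minimal, self-contained sketch of the distinction this report hinges on: `Class.forName` on an absent class throws `ClassNotFoundException`, while linking a class whose *dependency* (here, `HiveConf`) is absent surfaces as `NoClassDefFoundError`. A guard that catches only the former misses the second case. The helper and class names below are illustrative, not the actual repl code:

```java
public class FallbackDemo {
    // Returns the preferred class name if it can be loaded, otherwise the
    // fallback. Catches both failure modes: NoClassDefFoundError is an
    // Error, not an Exception, so catching ClassNotFoundException (or even
    // a blanket "catch (Exception e)") would not cover it.
    static String contextClassOrFallback(String preferred, String fallback) {
        try {
            Class.forName(preferred);
            return preferred;
        } catch (ClassNotFoundException | NoClassDefFoundError e) {
            return fallback;
        }
    }

    public static void main(String[] args) {
        // Hive is not on the classpath in this sketch, so we fall back to
        // the plain SQLContext class name.
        System.out.println(contextClassOrFallback(
            "org.apache.hadoop.hive.conf.HiveConf",
            "org.apache.spark.sql.SQLContext"));
    }
}
```

The same two-case catch, applied around the reflective constructor lookup in `SparkILoop.createSQLContext`, lets the shell fall back to a plain SQLContext instead of aborting.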


