Posted to issues@spark.apache.org by "Thomas Graves (JIRA)" <ji...@apache.org> on 2015/08/06 21:07:05 UTC
[jira] [Created] (SPARK-9701) allow not automatically using HiveContext with spark-shell when hive support built in
Thomas Graves created SPARK-9701:
------------------------------------
Summary: allow not automatically using HiveContext with spark-shell when hive support built in
Key: SPARK-9701
URL: https://issues.apache.org/jira/browse/SPARK-9701
Project: Spark
Issue Type: Bug
Components: SQL
Affects Versions: 1.4.1
Reporter: Thomas Graves
I built the Spark jar with Hive support, since most of our grids have Hive. We were bringing up a new YARN cluster that didn't have Hive installed yet, which resulted in the spark-shell failing to launch:
java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:374)
at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:116)
at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:163)
at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:161)
at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:168)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
It would be nice to have a config option (or something similar) to tell spark-shell not to instantiate a HiveContext.
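
For illustration only, here is a minimal sketch of what such a guard could look like in the REPL's SQLContext factory (mirroring the SparkILoop.createSQLContext method from the stack trace above). The config key "spark.repl.useHiveContext" is a hypothetical name chosen for the example, not an actual Spark setting, and this is not the actual fix; it just shows the idea of falling back to a plain SQLContext when the HiveContext cannot or should not be created.

import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

def createSQLContext(sc: SparkContext): SQLContext = {
  // Hypothetical flag; defaults to the current behavior of preferring HiveContext.
  val useHive = sc.getConf.getBoolean("spark.repl.useHiveContext", true)
  if (useHive) {
    try {
      // Instantiate HiveContext reflectively so this compiles without the hive module on the classpath.
      Class.forName("org.apache.spark.sql.hive.HiveContext")
        .getConstructor(classOf[SparkContext])
        .newInstance(sc)
        .asInstanceOf[SQLContext]
    } catch {
      case e: Exception =>
        // Fall back instead of failing the whole shell when the metastore is unavailable.
        println(s"Could not create HiveContext (${e.getMessage}); using SQLContext instead.")
        new SQLContext(sc)
    }
  } else {
    new SQLContext(sc)
  }
}

With something like this, a user on a cluster without Hive could set the flag to false (or rely on the fallback) and still get a working shell with a plain SQLContext.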
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)