Posted to issues@spark.apache.org by "Josh Rosen (JIRA)" <ji...@apache.org> on 2016/01/24 21:30:39 UTC
[jira] [Updated] (SPARK-12120) Improve exception message when failing to initialize HiveContext in PySpark
[ https://issues.apache.org/jira/browse/SPARK-12120?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Josh Rosen updated SPARK-12120:
-------------------------------
Assignee: Jeff Zhang
> Improve exception message when failing to initialize HiveContext in PySpark
> ---------------------------------------------------------------------------
>
> Key: SPARK-12120
> URL: https://issues.apache.org/jira/browse/SPARK-12120
> Project: Spark
> Issue Type: Improvement
> Components: PySpark
> Reporter: Jeff Zhang
> Assignee: Jeff Zhang
> Priority: Minor
>
> I get the following exception message when HiveContext fails to initialize, and it makes it hard to figure out why the initialization failed. I actually built Spark with the Hive profile enabled; the real reason HiveContext failed is that I had not started the HDFS service. I can see the full stack trace in spark-shell and also in Python 3, so the issue only exists in Python 2.x.
> {code}
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
>   File "/Users/jzhang/github/spark/python/pyspark/sql/context.py", line 430, in createDataFrame
>     jdf = self._ssql_ctx.applySchemaToPythonRDD(jrdd.rdd(), schema.json())
>   File "/Users/jzhang/github/spark/python/pyspark/sql/context.py", line 691, in _ssql_ctx
>     "build/sbt assembly", e)
> Exception: ("You must build Spark with Hive. Export 'SPARK_HIVE=true' and run build/sbt assembly", Py4JJavaError(u'An error occurred while calling None.org.apache.spark.sql.hive.HiveContext.\n', JavaObject id=o34))
> {code}
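> For illustration, here is a minimal sketch (not the actual Spark patch) of how the error handling around HiveContext initialization could keep the underlying Py4J failure visible under Python 2, which lacks implicit exception chaining. The helper name and the get_hive_ctx stand-in are hypothetical; the real logic lives in the _ssql_ctx property in pyspark/sql/context.py.
> {code}
> import traceback
>
>
> def init_hive_ctx(get_hive_ctx):
>     """Call the JVM-side constructor and, if it fails, re-raise with the
>     original stack trace embedded in the message.
>
>     `get_hive_ctx` stands in for the Py4J call that constructs
>     org.apache.spark.sql.hive.HiveContext; the name is hypothetical.
>     """
>     try:
>         return get_hive_ctx()
>     except Exception:
>         # Python 2 has no `raise ... from ...` chaining, so the underlying
>         # Py4JJavaError would otherwise be reduced to its repr inside a tuple.
>         # Embedding the formatted traceback keeps the real cause (e.g. an
>         # unreachable HDFS namenode) visible to the user.
>         original = traceback.format_exc()
>         raise Exception(
>             "Failed to create HiveContext. Make sure Spark was built with "
>             "Hive support and that required services (e.g. HDFS) are "
>             "running.\nOriginal error:\n%s" % original)
> {code}
> With a handler shaped like this, the Py4JJavaError text (including the Java-side cause, such as a connection failure to HDFS) would appear directly in the Python exception instead of only the misleading "You must build Spark with Hive" hint.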