Posted to issues@spark.apache.org by "Jeff Zhang (JIRA)" <ji...@apache.org> on 2016/06/17 08:53:05 UTC
[jira] [Comment Edited] (SPARK-16013) Add option to disable HiveContext in spark-shell/pyspark
[ https://issues.apache.org/jira/browse/SPARK-16013?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15335727#comment-15335727 ]
Jeff Zhang edited comment on SPARK-16013 at 6/17/16 8:53 AM:
-------------------------------------------------------------
I mean to introduce this to 1.x, since in Spark 2.0 we can already disable HiveContext by setting spark.sql.catalogImplementation
was (Author: zjffdu):
I mean to introduce this to 1.6, since in Spark 2.0 we can already disable HiveContext by setting spark.sql.catalogImplementation
> Add option to disable HiveContext in spark-shell/pyspark
> --------------------------------------------------------
>
> Key: SPARK-16013
> URL: https://issues.apache.org/jira/browse/SPARK-16013
> Project: Spark
> Issue Type: Improvement
> Components: PySpark, Spark Shell
> Affects Versions: 1.6.1
> Reporter: Jeff Zhang
>
> In Spark 2.0 we can disable HiveContext by setting spark.sql.catalogImplementation, but in Spark 1.6 there seems to be no option to turn off HiveContext. This brings several issues that I can see.
> * If a user launches multiple spark-shell sessions on the same machine without specifying hive-site.xml, the sessions will conflict over the embedded Derby metastore.
> * Any Spark downstream project that wants to reuse the spark-shell code has to rely on whether the hive profile is turned on to decide whether a HiveContext is created. This doesn't make sense to me.
> Although Spark 2.0 doesn't have this issue, most people are still using Spark 1.x, so I think this feature would benefit users. I can create a PR if this feature is reasonable for users.
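For reference, here is a sketch of the Spark 2.0 workaround the comment refers to. In Spark 2.x, spark.sql.catalogImplementation accepts the values "hive" and "in-memory", and can be passed on the command line with the standard --conf flag:

```shell
# Spark 2.0+: start spark-shell without the Hive catalog by selecting
# the in-memory catalog implementation instead of the default "hive".
spark-shell --conf spark.sql.catalogImplementation=in-memory

# The same flag works for pyspark:
pyspark --conf spark.sql.catalogImplementation=in-memory
```

With the in-memory catalog, no Hive metastore (and hence no embedded Derby lock) is created, which is exactly the behavior the issue asks to make available in 1.6.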
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org