Posted to issues@spark.apache.org by "Wenchen Fan (JIRA)" <ji...@apache.org> on 2019/02/14 07:09:00 UTC
[jira] [Assigned] (SPARK-26794) SparkSession enableHiveSupport does not point to hive but in-memory while the SparkContext exists
[ https://issues.apache.org/jira/browse/SPARK-26794?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Wenchen Fan reassigned SPARK-26794:
-----------------------------------
Assignee: Kent Yao
> SparkSession enableHiveSupport does not point to hive but in-memory while the SparkContext exists
> -------------------------------------------------------------------------------------------------
>
> Key: SPARK-26794
> URL: https://issues.apache.org/jira/browse/SPARK-26794
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.3.2
> Reporter: Kent Yao
> Assignee: Kent Yao
> Priority: Major
>
> {code:java}
> import org.apache.spark.SparkConf;
> import org.apache.spark.api.java.JavaSparkContext;
> import org.apache.spark.sql.SparkSession;
>
> public class SqlDemo {
>     public static void main(final String[] args) throws Exception {
>         // A SparkContext is created first, so the shared state
>         // (including the catalog) is instantiated from this conf.
>         SparkConf conf = new SparkConf().setAppName("spark-sql-demo");
>         JavaSparkContext sc = new JavaSparkContext(conf);
>
>         // Hive support is requested here, but the resulting session
>         // still uses the in-memory catalog because the SparkContext
>         // already exists.
>         SparkSession ss = SparkSession.builder().enableHiveSupport().getOrCreate();
>         ss.sql("show databases").show();
>     }
> }
> {code}
> Before SPARK-20946, the demo above pointed to the correct Hive metastore when a hive-site.xml was present. Now it can only point to the default in-memory catalog.
> The catalog is now a variable shared across SparkSessions, and it is instantiated with the SparkContext's conf. Since SPARK-20946, session-level configs are no longer passed to the SparkContext's conf, so the enableHiveSupport API has no effect on the catalog instance.
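> A quick way to observe this (a diagnostic sketch added for illustration, run inside the demo's main after the session is created): the context-level conf, which drives the shared catalog, still reports the in-memory implementation even though enableHiveSupport() was called:
> {code:java}
> // Diagnostic sketch: the shared catalog is chosen from the
> // SparkContext's conf, not from the session builder's options,
> // so this prints "in-memory" for the demo above.
> String impl = ss.sparkContext().getConf()
>         .get("spark.sql.catalogImplementation", "in-memory");
> System.out.println("catalog implementation: " + impl);
> {code}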
> You can set spark.sql.catalogImplementation=hive application-wide to solve the problem, or avoid creating a SparkContext before you call SparkSession.builder().enableHiveSupport().getOrCreate().
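> A minimal sketch of the first workaround (hypothetical class name SqlDemoFixed; assumes spark-hive and a hive-site.xml are on the classpath): set the catalog implementation on the SparkConf before any SparkContext is created, so the shared catalog is instantiated as a Hive catalog:
> {code:java}
> import org.apache.spark.SparkConf;
> import org.apache.spark.api.java.JavaSparkContext;
> import org.apache.spark.sql.SparkSession;
>
> public class SqlDemoFixed {
>     public static void main(final String[] args) throws Exception {
>         // Set the catalog implementation application-wide, before any
>         // SparkContext exists, so the shared catalog is backed by Hive.
>         SparkConf conf = new SparkConf()
>                 .setAppName("spark-sql-demo")
>                 .set("spark.sql.catalogImplementation", "hive");
>         JavaSparkContext sc = new JavaSparkContext(conf);
>
>         // The builder's request now matches the context-level conf.
>         SparkSession ss = SparkSession.builder().enableHiveSupport().getOrCreate();
>         ss.sql("show databases").show();
>     }
> }
> {code}
> Equivalently, calling SparkSession.builder().enableHiveSupport().getOrCreate() before constructing any JavaSparkContext works, because the builder then creates the SparkContext itself with the hive setting already applied.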
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)