Posted to issues@spark.apache.org by "Kent Yao (JIRA)" <ji...@apache.org> on 2019/01/31 08:11:00 UTC

[jira] [Created] (SPARK-26794) SparkSession enableHiveSupport does not point to hive but in-memory while the SparkContext exists

Kent Yao created SPARK-26794:
--------------------------------

             Summary: SparkSession enableHiveSupport does not point to hive but in-memory while the SparkContext exists
                 Key: SPARK-26794
                 URL: https://issues.apache.org/jira/browse/SPARK-26794
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 2.3.2
            Reporter: Kent Yao


{code:java}
package com.netease.bdms.spark.sql;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;

public class SqlDemo {
    public static void main(final String[] args) throws Exception {
        // Create the SparkContext first, then ask for a Hive-enabled SparkSession.
        SparkConf conf = new SparkConf().setAppName("spark-sql-demo");
        JavaSparkContext sc = new JavaSparkContext(conf);

        SparkSession ss = SparkSession.builder().enableHiveSupport().getOrCreate();
        // Expected to list the Hive metastore databases, but it only shows the
        // in-memory catalog's "default" database.
        ss.sql("show databases").show();
    }
}
{code}
Before SPARK-20946, the demo above pointed to the correct Hive metastore when hive-site.xml was present. Now it can only point to the default in-memory catalog.

The catalog is now a variable shared across SparkSessions, and it is instantiated from the SparkContext's conf. After SPARK-20946, session-level configs are no longer passed down to the SparkContext's conf, so the enableHiveSupport API has no effect on the catalog instance.
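
To see why, one can inspect the conf the shared catalog is built from. A minimal check (a sketch, meant to be appended to the demo's main method above; the printed message is illustrative) reads spark.sql.catalogImplementation from the SparkContext's conf rather than from the session builder:

{code:java}
// The shared catalog is instantiated from the SparkContext's conf, so look there.
// With only enableHiveSupport() set on the builder, this still falls back to the
// in-memory default, because the session option is no longer copied into the conf.
String impl = sc.getConf().get("spark.sql.catalogImplementation", "in-memory");
System.out.println("Catalog implementation backing this application: " + impl);
{code}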

You can set spark.sql.catalogImplementation=hive application-wide to work around the problem, or avoid creating a SparkContext before calling SparkSession.builder().enableHiveSupport().getOrCreate(). Sketches of both workarounds follow.
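
As an illustration of the first workaround, here is a sketch (class name is a placeholder, not from the report) that sets spark.sql.catalogImplementation=hive on the SparkConf before the SparkContext is created, so the shared catalog built from that conf is Hive-backed:

{code:java}
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;

public class HiveCatalogViaConfDemo {
    public static void main(final String[] args) throws Exception {
        // Set the catalog implementation application-wide on the SparkConf.
        SparkConf conf = new SparkConf()
                .setAppName("spark-sql-demo")
                .set("spark.sql.catalogImplementation", "hive");
        JavaSparkContext sc = new JavaSparkContext(conf);

        SparkSession ss = SparkSession.builder().enableHiveSupport().getOrCreate();
        ss.sql("show databases").show();

        ss.stop();
    }
}
{code}

Alternatively, a sketch of the second workaround builds the Hive-enabled SparkSession first and derives the JavaSparkContext from it, so enableHiveSupport() is honored when the underlying SparkContext is created:

{code:java}
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;

public class HiveCatalogSessionFirstDemo {
    public static void main(final String[] args) throws Exception {
        // Build the Hive-enabled SparkSession before any SparkContext exists.
        SparkSession ss = SparkSession.builder()
                .appName("spark-sql-demo")
                .enableHiveSupport()
                .getOrCreate();

        // Derive the JavaSparkContext from the session instead of creating one up front.
        JavaSparkContext sc = JavaSparkContext.fromSparkContext(ss.sparkContext());

        ss.sql("show databases").show();
        ss.stop();
    }
}
{code}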




