Posted to issues@spark.apache.org by "Ran Mingxuan (JIRA)" <ji...@apache.org> on 2017/11/20 08:09:00 UTC
[jira] [Created] (SPARK-22560) Must create spark session directly to connect to hive
Ran Mingxuan created SPARK-22560:
------------------------------------
Summary: Must create spark session directly to connect to hive
Key: SPARK-22560
URL: https://issues.apache.org/jira/browse/SPARK-22560
Project: Spark
Issue Type: Bug
Components: Java API, SQL
Affects Versions: 2.1.0
Reporter: Ran Mingxuan
I have built a Spark job like the one below:
{code:java}
// wrong code
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;

public static void main(String[] args)
{
    SparkConf sparkConf = new SparkConf().setAppName("testApp");
    // Create the context first, then hand it to the session builder
    JavaSparkContext sc = new JavaSparkContext(sparkConf);
    sc.hadoopConfiguration().setBoolean("mapreduce.fileoutputcommitter.marksuccessfuljobs", false);
    sc.hadoopConfiguration().setBoolean("parquet.enable.summary-metadata", false);
    SparkSession spark = SparkSession.builder().sparkContext(sc.sc()).enableHiveSupport().getOrCreate();
    spark.sql("show databases").show();
}
{code}
With this code the Spark job is not able to find the Hive metastore, even though it discovers the correct warehouse directory. Presumably enableHiveSupport() only records spark.sql.catalogImplementation=hive on the builder, and that setting cannot take effect on a SparkContext that has already been created.
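A quick way to see which catalog the session actually got (my own diagnostic suggestion, not part of the original job) is to read the spark.sql.catalogImplementation setting:
{code:java}
// Diagnostic sketch (assumed addition, not in the original report):
// "hive" means the Hive metastore is wired in; "in-memory" reproduces the bug.
System.out.println(spark.conf().get("spark.sql.catalogImplementation", "in-memory"));
{code}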
I have to use code like the following to make things work:
{code:java}
// correct code
import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;

public static void main(String[] args)
{
    SparkConf sparkConf = new SparkConf().setAppName("nice_clean");
    // Let the builder create the context, so enableHiveSupport() is applied
    SparkSession spark = SparkSession.builder().config(sparkConf).enableHiveSupport().getOrCreate();
    // Unwrap the context afterwards to set the Hadoop options
    SparkContext sparkContext = spark.sparkContext();
    JavaSparkContext sc = JavaSparkContext.fromSparkContext(sparkContext);
    sc.hadoopConfiguration().setBoolean("mapreduce.fileoutputcommitter.marksuccessfuljobs", false);
    sc.hadoopConfiguration().setBoolean("parquet.enable.summary-metadata", false);
    spark.sql("show databases").show();
}
{code}
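If the Hadoop options are the only reason the JavaSparkContext is needed, they can also be pushed through the SparkConf itself: Spark copies any property prefixed with spark.hadoop. into the Hadoop Configuration. A minimal sketch of that variant (my own suggestion, not from the report above):
{code:java}
// Alternative sketch: set Hadoop options via the spark.hadoop. prefix,
// so no JavaSparkContext has to be created or unwrapped at all.
import org.apache.spark.SparkConf;
import org.apache.spark.sql.SparkSession;

public static void main(String[] args)
{
    SparkConf sparkConf = new SparkConf()
        .setAppName("testApp")
        .set("spark.hadoop.mapreduce.fileoutputcommitter.marksuccessfuljobs", "false")
        .set("spark.hadoop.parquet.enable.summary-metadata", "false");
    SparkSession spark = SparkSession.builder().config(sparkConf).enableHiveSupport().getOrCreate();
    spark.sql("show databases").show();
}
{code}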